Sunday, December 28, 2008
What is AJAX?
In the past we had server-side technologies like servlets and JSP pages, and client-side technologies like JavaScript, for web development. The common characteristic of these technologies was that a web page was redisplayed in its entirety whenever the server needed to communicate any change. Asynchronous JavaScript and XML (AJAX) is the extension of various web technologies to facilitate client-server communication without reloading the current page. The XMLHttpRequest object enables communication with server-side scripts from within JavaScript, allowing portions of a page to be updated in response to user events. A good example of
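The partial-page update described above can be sketched with the classic XMLHttpRequest pattern. This is a minimal, illustrative sketch; the URL and element id are hypothetical, not taken from any particular application.

```javascript
// Minimal sketch of the classic AJAX pattern: fetch a fragment from the
// server asynchronously and hand it to a callback, so only one part of
// the page is refreshed rather than the whole page.
function updateFragment(url, onUpdate) {
  var xhr = new XMLHttpRequest();
  xhr.open("GET", url, true);            // true = asynchronous request
  xhr.onreadystatechange = function () {
    // readyState 4 = request complete; 200 = HTTP OK
    if (xhr.readyState === 4 && xhr.status === 200) {
      onUpdate(xhr.responseText);        // pass the fragment to the caller
    }
  };
  xhr.send(null);
}

// Typical browser use, e.g. refreshing one <div> on a user event
// (the endpoint and element id here are made up for illustration):
// updateFragment("/stock-price", function (text) {
//   document.getElementById("price").innerHTML = text;
// });
```

Because the request is asynchronous, the rest of the page stays responsive while the server replies.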
Saturday, December 27, 2008
Heroic programming is inadequate
Risk management through iteration
The risk manifests itself in various ways: the risk of missing a business deadline or delivery; the risks at every stage of the development cycle, such as poor understanding of requirements, an inflexible and unresponsive design, the developers' unfamiliarity with the problem domain or with new technology, the unavailability of required resources, and attritional team dynamics; the risks attached to budgetary, organisational and time constraints; and the risks stemming from size, complexity and change. Mitigating these risks and delivering a system to budget, on time and to specification is the essence of project management, so, unsurprisingly, risk management should be a core concept in development methodologies.
Iteration refines understanding through feedback and reduces risk. The completeness and accuracy of requirements capture can be checked through prototyping, which lowers the risk of project failure, and the cost of early discovery is cheap. Algorithms, workflows, human-computer interfaces, alternative designs, stress tests, package suitability and so on can all be prototyped and checked against objectives until they meet the need. Size and complexity never become overwhelming, because the iterative approach spreads the work over multiple development cycles. Nor should the final solution come as a surprise to users, since it will have been implemented in stages and with their approval.
Friday, December 26, 2008
Challenges of cloud computing
The readiness to veer in the direction of cloud computing in the current economic climate is understandable. It makes sense to trade fixed cost for variable cost and to shift from capital expenditure to operational expenditure, and there are a number of CRM and other application players online which can readily meet our functional needs. So opting for cloud computing looks like a no-brainer, but it is not a panacea. Apart from the normal operational concerns about training, regulatory compliance, security, connectivity, the fragmented nature of the offerings and so on, a big question remains about populating these cloud offerings with the data from our existing systems. We need to think through clearly how we are going to do bulk data updates and, eventually, extract the data from these cloud offerings for analysis in other systems. The unavailability of an API can further complicate integration with other systems, and the problem is compounded if we have to move the data from one cloud-based offering to another.
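The bulk-extraction concern above has a concrete shape: most SaaS APIs return records a page at a time, so pulling everything out for analysis elsewhere means looping until the pages run dry. The sketch below assumes a hypothetical paged API; `fetchPage` stands in for whatever call the provider actually exposes.

```javascript
// Hypothetical sketch of bulk data extraction from a cloud offering.
// fetchPage(n) represents one call to the provider's (assumed) paged API,
// e.g. GET /records?page=n, returning an array of records for that page.
function exportAllRecords(fetchPage) {
  var all = [];
  var page = 0;
  var batch;
  do {
    batch = fetchPage(page);   // one page of records from the provider
    all = all.concat(batch);   // accumulate locally for downstream analysis
    page += 1;
  } while (batch.length > 0);  // an empty page signals the end of the data
  return all;
}
```

Even this simple loop depends on the provider exposing an API at all, which is exactly the integration risk noted above.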
A useful chart of the cloud-computing landscape gives a clear picture of the players and the utilities, services and applications they offer.
There are many similarities between the SaaS offerings of today and the bureaux services of yore, but we should appreciate the differences. Computer bureaux date back to the era when computers were expensive, and batch processing and dumb terminals multiplexed into a central mainframe to maximise usage were the norm. A discrete service like payroll or accounting was offered on a centralised server, on a time-sharing basis, to multiple clients, and client data was transferred on magnetic tapes and disks for batch processing; charging was based on the computer time used. SaaS offerings, by contrast, provide web-based, installation-free access to managed services on centralised hosts, delivering integrated applications like enterprise resource planning or customer relationship management systems (SAP, Salesforce etc.). These are truly distributed offerings, whereby data from the central repository can be manipulated on the local PC, and charging is normally based on the user population and the number of concurrent users. The motivation in the 1960s and 70s was sharing expensive resources; nowadays concerns like availability, scalability, reliability and security are paramount. The disjointed, slow, cumbersome batch approach of the bureaux has acquired 24x7 availability, responsiveness and seamless integration in the SaaS world. The whole burden of licence management, version control, resilient configuration, secure access, disaster recovery and so on devolves on the ASP, and is far more complex nowadays. We have moved from the pseudo-parallelism of the bureau to the distributed, concurrent environment of SaaS.
The complexity involved in deploying and upgrading software in a distributed environment, the consequent difficulties in negotiating licences, interoperability issues, the ubiquity of the browser-based client, fast and cheap communication, affordable scalability, the trend towards outsourcing and so on have all aided the drive towards SaaS. Hardware and software are seen as purchasable commodities, and organisations prefer to concentrate on their core competencies, expecting a secure and resilient service from the experts. The ASPs, for their part, are confident that benchmarks exist to guarantee the requisite concurrency and performance from their server farms, allowing them to focus on their domain expertise. This approach is cheaper than an in-house solution, but careful operational planning is required to make it a success.