Of late, I have been visiting IT deployments in many small and medium organizations — in the private sector as well as the Government sector. During the process, I had the opportunity to study initiatives taken by the business, as well as the IT support necessary to make business succeed.
In many such cases, the IT group helped make the organization more efficient, enabling it to respond well to its customers and successfully roll out e-Governance initiatives. Some of them appeared to be very successful ventures and showed a lot of promise. In the case of a few others, I did not find the IT response to be adequate. Though everyone seemed happy with the situation, a closer look gave me the feeling that the gains were not sustainable, given the inadequacy of the IT infrastructure that supported these initiatives.
Let me explain the issue further. Meaningful IT support to business begins with a plan, a strategy, and a roadmap for the long run. Sometimes this is ignored in the interest of immediate gains, and at other times this aspect is not properly understood. I will list a few situations to drive home the point.
Piecemeal solutions over time: This is a common phenomenon in many situations. Each requirement, as it is expressed, is converted into a system, and programs are written and rolled out for implementation. Several systems then get developed, often by different programmers and on a variety of platforms. Whenever the need for an interface arises, some element of data passing or a loose integration through Web services makes everyone happy. Little do they realize that it is these ad-hoc solutions that lead to an avoidable mess as the needs expand and more solutions get developed.
Total reliance on in-house staff: In several cases, people take pride in announcing that all systems have been developed in-house, and that there is no external input. The impression given is that they have saved costs for the organization, and that the internal staff is good enough to handle all organizational needs. The trouble in such cases is that the group’s knowledge does not grow, and they keep doing what they know. This aspect is reflected in the way the solutions are developed, in the way hardware platforms are chosen, and the manner in which systems are written. The methods at times are outdated, and lack a contemporary approach.
Lack of participation at the management level: Unless the IT head is senior enough and participates in business discussions, the solutions will always be ad-hoc and lack a long-term vision. I have met a few managers who were fully involved in the business initiatives and were well aware of the business directions and goals in focus. Others were happy to play a background role; the solutions they developed were suspect, as they could not have held up when the business expanded or when situations got complex. Such an approach remains that of a programmer, who only looks for an opportunity to write a program.
Making do with small IT teams: I was surprised to hear from a few Government departments about their achievements in spite of having small teams. They spoke of economizing on budgets and about outsourcing. I felt that many of them did not understand the importance of IT, and believed that by outsourcing system development they had done the right thing. There was still no IT direction, and their moves were dictated by the appointed vendors.
From all these situations, I learnt that managements driving such initiatives should ensure that the IT solution infrastructure is aligned to organizational growth, and that the IT platforms built will last for the next few years. It is also important to ensure that the technology is suitably updated, so that IT gives them an edge and the organization benefits from technology usage.
We have been speaking about going paperless for many years now. People were initially very skeptical and said that this could never be achieved. However, as technology progressed, a paperless scenario looked more real. The advent of e-mail made it possible to circulate memos, Word files, and Excel sheets in electronic form, thus avoiding a print on paper. However, the drawback was that it left a lot of organizational information scattered and, worse, lying on individual PCs.
I have experimented with technologies like document and content management, work flow, microfilming, etc. which have brought in a lot of relief and the attendant advantages. The implementation journey and the transition were however not easy and we had to face several hurdles. But even today it remains quite a challenge and it certainly is not a cakewalk in most places. Let us look at some of the factors that hinder progress on this front.
Removing the roadblocks
1. Breaking the habit: People who have been used to keeping papers in files cannot easily give up the habit of storing records physically. They still print the documents with the plea that they are uncomfortable reading large documents on screen.
2. Inertia in classifying and organizing documents: We usually ask users to group and classify their documents subject-wise so that they can be organized and stored in a central repository. Many do not cooperate, saying that their classification changes as they frequently handle new subjects. Some are so used to creating ad-hoc directories on their PCs that they are not amenable to taking a holistic view and creating a new order. This is one stage I have seen take inordinately long.
3. Plea of flexibility: Keeping records with themselves seems so easy to users that they argue against centralization. Organized central filing will obviously entail following rules and an end to ad-hoc modification and deletion of files. They therefore claim a loss of flexibility.
4. Perceived loss of control: If we have the records with us, we experience a sense of power. People who want information would ask us and that gives us a feeling of importance. If I am keeping the records, my boss will have to call me for information and would be dependent on me. Agreeing to move the records to a central location would amount to giving up my rights.
5. Resisting destruction of physical records: This was another challenge that I had to contend with. Even after we had scanned documents and lodged them into the document management system, users were reluctant to destroy their old physical records. It required a lot of persuasion. Similarly, when we had converted old records into microfilms with an additional copy as a back-up, users resisted destruction of the physical records. I then had to temporarily halt further conversion and wrote to the management seeking directions. Then came the diktat for destruction of records, and that enabled the company to give up the hired document storage warehouses, thus bringing down expenses.
Though a move towards a paperless environment is a reality, it still faces roadblocks, and these need to be handled well and with a certain measure of firmness. In most cases it is about instilling discipline and bringing about order. Once users experience the benefits of electronic handling of documents, they push for more and never look back.
If we are convinced about the usefulness of microfilming old records, the next step will be to work out a justification and make a case for introducing this in the organization. So let us discuss the steps that we need to go through.
Have a valid financial justification
The first step is to identify documents which can be candidates for microfilming. For example, old accounting records that are, say, more than three years old (i.e. post statutory audit, or those not required for MIS purposes) can be taken up. This would include all ledgers, vouchers, and other supporting documents. Other examples would be records related to Sales Tax, Excise, or old shareholder transactions. We need to emphasize the advantages to the users, such as the freeing up of storage space and ease of access. In the organization where I implemented this system, we could give up the storage space hired for keeping these records, and we could easily access old records, which had been an uphill task earlier.
Our old documents, even if archived, reside on magnetic tapes, cartridges, low-cost disks, etc.; but these are expensive and less reliable, as such media deteriorate over long periods of time. Microfilms, on the other hand, are less expensive and have a much longer life. A financial justification can therefore be easily worked out.
Organizations usually start with converting old documents in physical form, as this helps in converting the bulk of documents into a single film tape or cartridge. The process has to be done carefully, and it is best to outsource this activity to external agencies that are experts in it. You first have to classify documents, number them, and then convert them in the right sequence. You can classify them on the basis of document type, year, etc. Once converted, all tapes have to be properly indexed and labeled. It is necessary to exercise control to ensure that documents are not missed out and also that they are not duplicated.
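To make that last control step concrete, here is a small, purely illustrative sketch of how a batch manifest could be checked for missing or duplicate document numbers before filming. It is not taken from any actual microfilming system; the record fields (`doc_no`, `doc_type`, `year`) are my own assumptions.

```python
# Hypothetical control check for a microfilm conversion batch:
# verify the manifest has no gaps or duplicates in document numbers.

def check_manifest(manifest):
    """Return (missing, duplicates) for a list of document records."""
    numbers = [rec["doc_no"] for rec in manifest]
    seen, duplicates = set(), set()
    for n in numbers:
        if n in seen:
            duplicates.add(n)
        seen.add(n)
    # Any number absent from the continuous range was missed out.
    full_range = set(range(min(numbers), max(numbers) + 1))
    missing = full_range - seen
    return sorted(missing), sorted(duplicates)

# Illustrative batch: document 103 is missing, 104 appears twice.
batch = [
    {"doc_no": 101, "doc_type": "voucher", "year": 2005},
    {"doc_no": 102, "doc_type": "voucher", "year": 2005},
    {"doc_no": 104, "doc_type": "ledger", "year": 2005},
    {"doc_no": 104, "doc_type": "ledger", "year": 2005},
]
missing, dups = check_manifest(batch)
print(missing, dups)  # [103] [104]
```

A report like this, run before handing a batch to the conversion agency, is one simple way of exercising the control the text describes.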
To ensure authenticity, the recording starts with a document signed by the authorized person, and a similar document at the end signals the end of the recording. Such a microfilmed record therefore cannot be tampered with. These records are accepted as evidence by various statutory authorities.
In order to access and read these tapes we would have to buy a film reader which converts the tiny films into a readable form through display on a screen attached to the reader. These machines are relatively inexpensive and therefore affordable.
Say goodbye to the old habits
Having converted our old physical records to film, the next step will be to avoid creating new documents in physical form, so that we do not go through the entire grind once again. For example, many organizations have stopped printing general/sub ledgers, but may take one copy at the year end. It is here that we need to bring about a change: why not transfer the ledger directly from magnetic media to microfilm tapes? The technology available today makes this possible, and so it is best to use this interface. The same principle can be applied to various documents that need to be held over long periods of time.
So here is a simple technology which can be gainfully used to solve a part of our storage problems. The process may seem difficult in the beginning but after the initial conversion, the ongoing process falls into a routine and can be easily managed.
Microfilming of old records is an area which has largely been ignored by CIOs. Apart from a few companies in the banking and financial sector and some government departments, most have not made use of this technology. The decision to microfilm old documents has largely been taken by functional departments; but the CIO, in my opinion, can take a lead here and explore application of this solution in his organization.
What is Microfilming?
Microfilms are films containing micro-reproductions of documents for transmission, storage, reading, and printing. These are essentially photographs of documents and are images commonly reduced to about one twenty-fifth of the original document size.
Microfilming, also called microphotography, consists of reducing images to such a small size that they cannot be read without optical assistance. Such photographic compression often results in a ninety-nine percent saving of space. With advancements in the field of documentary reproduction, the function of this facility is no longer restricted to storage, but extends to classification and retrieval. The usefulness of this medium is significant, as many documents deteriorate over time because of the poor quality of paper and print.
Common use of microfilms
Microfilming is a widely used practice in government as well as in banks and financial institutions. The huge volume of public records in government and of customer records in banks, for example, makes a good case for microfilming.
In companies, this has been used for storing the many documents that are statutorily required to be kept for several years. For example, accounting records are required to be preserved for eight years under the Companies Act and the Income Tax Act. Similarly, other Acts such as the Central Excise Act, the Sales Tax Act, etc. have their own stipulations. Some companies have also used this medium for preserving old legal files, employee personnel records, customer records, etc.
Safety and security
Besides saving space, the most important feature of this medium is document integrity and information security. Preservation of rare and deteriorating documents is considered one of the most important purposes of micro-recording. Valuable rare documents are now being microfilmed to protect them from loss and destruction, which would be irrecoverable in the case of valuable documents, records, or rare books. Several duplicates of microfilmed documents can be made available, while the original documentation may be kept in archival storage or may, in fact, be destroyed. For additional security, negatives and positives can be stored in different places; being of small bulk, they can be specially protected. The film, if properly processed, will last much longer than the originals.
Advocating its use
In my opinion, it’s time CIOs examined this solution and evaluated its use in their organizations. The advantages are substantial: by freeing up the space used by these documents, they can save huge rentals; the method enables quick access and retrieval; and it ensures safe keeping far better than is possible with traditional methods of handling physical documents.
In my next dispatch, I will explain my experiences with the use of microfilming and the benefits we got from the usage.
We always like to complain about the lack of support from the management or non-cooperation of the end users. We say we would have gone miles, had they been kind enough to us and had helped us for the cause. The lament is justified to some extent and I agree that we, as sincere professionals, need that kind of patronage.
All projects, however, do not go wrong, and neither are all situations so bad. The very basis of our standing is that we all have several successful projects to our credit and are confident of achieving many more goals. We talk of our success stories in various forums and willingly oblige magazines when they want to publish our case studies. It is easy to claim all the credit for ourselves and say that we succeeded in spite of several roadblocks, but will we be honest in saying that no one helped us in the entire show? If we have not adequately recognized and acknowledged the contributions made by various wings of the organization, it can be regarded as our weakness. The management does its bit to approve and sanction funds for our projects; and unless the end users make use of the systems we develop, how can we ever hope to meet the objectives? We, therefore, are never alone, and we have to recognize the contribution of others in our endeavors.
It will only be fair on our part to acknowledge the support that we receive from various quarters. This way, on one hand, we can complete the loop and, on the other, encourage them to lend their support for all our future projects. It is normal for people to resist change in the initial stages but we have learnt to get over this part through our experiences. If users willingly accept the technological changes on their own, it will be proper on our part to give them the credit for doing so.
There are various ways in which we can acknowledge their support; let me list a few:
- Once we get approval for our budget or project, we can write a note to the Board/ CEO/ others, thanking them for the approval and assuring them of our full efforts to make the project(s) successful.
- It is a good practice to send periodic reports to the management giving them an update on important projects so that they feel reassured on the projects/ budgets sanctioned by them.
- On successful completion of projects, send a ‘thank you’ note to the concerned business/ functional heads.
- Let them light a lamp or cut the ribbon whenever we kick-off or launch a project, and ask them to deliver a short address.
- If they have done a good job, cite them as examples/ reference to the rest of the organization.
- Formally thank them for their role once the project has been successfully completed.
- Take them out for a dinner or an event as recognition of their contribution and support.
- Involve them during the design of the processes, drawing a road map etc. to give them a feeling of participation.
These are some of the measures that came to my mind and I am sure you would have many more brilliant ideas. The point of emphasis is that we need the support of various stakeholders for the success of our ventures and we should get them on our side through a genuine sense of understanding and appreciation of their views and feelings.
A paperless office was a dream of many an organization in the 1980s and ’90s. The simple word processor was followed by spreadsheets and other office automation software. It was soon realized that documents so created needed to be managed. So the ‘Document Management System’ (DMS) came into being and later as the World Wide Web came along, it dawned on people that the artifacts placed on the Web also needed to be managed; so the ‘Content Management System’ (CMS) followed. Now let us look at these terms and understand them.
A document management system (DMS) is a computer system (or set of computer programs) used to track and store electronic documents, and / or images of paper documents. It is usually also capable of keeping track of the different versions created by different users (history tracking). Now, the term has some overlap with the concepts of content management systems. DMS is often viewed as a component of Enterprise Content Management (ECM) systems and related to digital asset management, document imaging, workflow systems and records management systems.
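The version-tracking idea mentioned above can be illustrated with a toy sketch: each save appends a new version rather than overwriting the previous one, so the history of who changed what is retained. This is only an illustration of the concept, not the design of any real DMS product.

```python
# Toy illustration of DMS-style history tracking:
# every save appends a new version instead of overwriting.

class Document:
    def __init__(self, name):
        self.name = name
        self.versions = []  # list of (version_no, author, content)

    def save(self, author, content):
        """Record a new version attributed to its author."""
        self.versions.append((len(self.versions) + 1, author, content))

    def latest(self):
        """Return the most recent version."""
        return self.versions[-1]

# Hypothetical usage: two users edit the same document in turn.
doc = Document("policy.docx")
doc.save("asha", "Draft v1")
doc.save("ravi", "Draft v1 with edits")
print(doc.latest())  # (2, 'ravi', 'Draft v1 with edits')
```

The full history (`doc.versions`) remains available, which is what lets a DMS answer "who changed this, and when".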
In early 2001 when I put in DMS in our organization, it took us some effort to change the organization’s work culture; I had to persuade people to try out the electronic form of managing documents. When it succeeded, I was very happy and satisfied but soon realized that I had to expand my horizon of thought to consider other demands that had suddenly sprung up. Our marketing department wanted all their publicity material, advertisements in print, radio and TV, artwork and other creative material to be stored, catalogued and made shareable. That was a tall order and I had to struggle to find a solution. Later, as our website got loaded with content and our intranet started exploding with matter, it became apparent that these too needed our attention. I then got exposed to the developing area of content management.
An enterprise content management system (ECM) involves management of content, documents, details and records related to the organizational processes of an enterprise. The purpose and result is to manage the organization’s unstructured information (content), with all its diversity of format and location. The main objectives of enterprise content management are to streamline access, eliminate bottlenecks, optimize security and maintain integrity.
A CMS/ ECM provides a collection of procedures for managing work flow in a collaborative environment. The procedures are designed to do the following:
- Allow for a large number of people to contribute to and share stored data.
- Control access to the data, based on user roles (defining which information users or user groups can view, edit, publish, etc.)
- Aid in easy storage and retrieval of data.
- Reduce repetitive duplicate input.
- Improve the ease of report writing.
- Improve communication between users.
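The second item in the list above, role-based access control, can be sketched in a few lines. The roles and permission names here are my own illustrative assumptions; real ECM products define their own, usually far richer, permission models.

```python
# Minimal sketch of role-based access control in a CMS/ECM.
# Role and action names are illustrative assumptions.

ROLE_PERMISSIONS = {
    "viewer": {"view"},
    "editor": {"view", "edit"},
    "publisher": {"view", "edit", "publish"},
}

def can(user_role, action):
    """Return True if the given role is allowed to perform the action."""
    return action in ROLE_PERMISSIONS.get(user_role, set())

print(can("editor", "publish"))    # False
print(can("publisher", "publish")) # True
print(can("viewer", "edit"))       # False
```

In practice such checks are evaluated per document or per folder, which is how a CMS controls what each user group can view, edit, or publish.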
In my last organization, I had put in ECM/ CMS and included all forms of records, including normal office documents. We provided access to all content through the enterprise portal, and ensured security by centrally defining access rights accorded by the respective managers through a work flow process. Placing all important office records centrally had quite a few advantages; data could be shared in the group, security features could be enabled and data management in terms of safety, back-up etc. became much easier. People could access records at any time and from any location since it was not confined to a desktop or laptop.
It is however important that the purpose of putting in content management system in any organization should be clear and subsequent steps should proceed in the stated direction. Success of the system can be gauged by fulfillment of the objectives, such as information sharing, safety/ security of data, user convenience, etc.
These festive months are times to enjoy and rejoice. We do have fun; but there is something else to these festivities that gives us the jitters. It is the greetings messages that run unhindered through our communication pipes and create those famed bottlenecks.
As a CIO, I have faced these situations often and have interesting stories to tell. It was in 1998, when I had just connected all offices of my organization on e-mail. In the initial period, reluctant users would send occasional mails to others only when forced to. We had connected all offices using a VSAT network with meager bandwidth, as it was expensive. But as Diwali approached, the network went numb, and it took us a while to discover that it was the sudden burst of traffic (greetings messages) that had choked our network. Then came the New Year, and the network started to blink again. People had by then learnt to create new cards using Paintbrush and other utilities, and those attachments were really heavy. Over the next two years, we took several measures to address this problem. For instance, I sent a mail to all, requesting them to be choosy when sending greetings and to send them only to those whom they knew rather than marking them to all. When that didn’t work, we had to block access to central groups for all except a few seniors. In order to bring a smile back to those sad faces, we introduced a greeting cards application, asking people to choose cards from it instead of creating their own. We invited all the creative artists to draw new cards with their signatures and add them to the library.
Matters changed over the years as bandwidth got cheaper. With larger pipes the problem has perhaps become manageable, or perhaps not quite so, as this traffic still poses a problem from time to time. We know of the choke created on our mobile networks when people’s SMS messages flow with gusto. The mobile companies have had to resort to higher tariffs for such periods as a measure of controlling traffic.
The greetings conundrum does cause its own sweet trouble. Being a social activity, it makes it difficult to be harsh with people, and managements generally sympathize with their brethren. This is tricky; CIOs have to find a new way to address this problem. There are a few tips that I can offer, though there could be other good methods adopted by some of us.
1. Advisory to users: Users sometimes need to be educated, made aware, or simply told to exercise judgment. It may help to send a message to all, asking them to send greetings only to those whom they know rather than marking them to all in the organization. I also used to mention users’ complaints about unsolicited greetings messages from people not known to them.
2. Set an example with our conduct: I decided that I would not send mass messages, and would also not reply to such messages even if they came from close friends. I then persuaded employees in my department and many senior functionaries to observe such restraint; and it worked.
3. Create a greetings library for internal usage: This helps standardize the ritual besides reducing data traffic. Those who do not follow the practice can be talked to.
4. Greetings coming in or going out of the organization: Such incoming and outgoing messages also create a bottleneck. Though not much can be done with respect to our dealings with official contacts, we can request users to make use of birthday greetings sites, or their personal mail (Yahoo, Gmail, etc.), for greeting their friends and other contacts.
Festivals are social events and we have to let people enjoy and greet each other. While such freedom is desirable, it makes sense to keep a watch on the computing and network infrastructure and ensure that it is available to the organization and people at large. That is the responsibility that a CIO is bestowed with, and he has to find a way to ensure that the systems function at all times.
I attended another conference last week, one of the many that dot our cities every week. Friday evenings are usually preferred for events by vendors, media companies, and other event management organizations, as that ensures better attendance. Nothing wrong with that; in fact, it keeps all the constituents happy. But let me discuss this specific conference, the last one I attended.
This was a full-day conference in one of the 5-star hotels in the NCR area, and it focused on ‘Data Center Strategies’. The event was to have several sessions during the day, with a few of them running on parallel tracks. The topics covered included the setting up of datacenters, cabling solutions, air conditioning options, infrastructure optimization, server virtualization, cloud computing, etc. The audience consisted of CIOs from various companies in the NCR area, though a few came from outside the region as well. I was invited to be a speaker in one of the sessions, as were a few other CIOs. The organizers had quite a few sponsors with them, and some of these put up stalls to display their products or expertise. There was also an entertainment program set up for the evening, followed by cocktails and dinner. In short, the event was planned to be a big affair with the right partners as sponsors, and they thought they had everything well worked out.
The event started off in the morning at the appointed time, though the count of people was much lower than what was planned. As the morning progressed a few others walked in, but the number was still a bit on the sorry side. Some members preferred to step outside the room during the sessions to network with fellow CIOs. The scene in the post-lunch period, however, turned a bit healthier, and by the evening tea break the numbers had built up to a decent scale. The exhibitors must have had a tough time, as very few people made their way to the stalls, which wore a deserted look. As the sessions ended, a great many arrived from nowhere, in full flow to be a part of the entertainment show in the evening and the networking cocktails and dinner thereafter.
I am not sure if the organizers, sponsors, exhibitors, the speakers or the audience were really happy with the way the event went; however, I do not think the purpose of the conference was really served. As I set out to think about the conference and reasons for it falling short of a success, a few points emerged:
1. The target audience was not clearly identified: The subject of datacenter strategies did not go down well with the CIOs. It was clear that most CIOs are increasingly moving towards outsourcing their computing facilities. Many of them have hosted their servers externally, and a few have already moved their applications to run as PaaS or SaaS services. The cloud computing model is also being looked at seriously by CIOs, and so the direction is clear. In short, CIOs are moving away from fortifying their datacenters. The targeted audience was therefore not very appropriate, and the seminar would have been better served by targeting service providers and the staff of large datacenters.
2. The conference was stretched: A full-day seminar was perhaps a stretch, and CIOs obviously did not find it easy to take off a full day to attend. That perhaps explains why the audience swelled only in the second half. I usually find that the attention span of the audience wanes after half a day of closed-room presentations and discussions.
3. Need to understand the mood of CIOs: In all events it is important to understand the need of the audience and programs should be designed accordingly. Organizers, however, under pressure from sponsors, usually subject CIOs to long sessions including vendor presentations. Seminars nowadays happen by the dozen and CIOs attend only those where they find value; in other cases they come to network with their fellow professionals. Friday evenings are usually relaxed and a short seminar followed by entertainment and dinner is well accepted.
These were a few of my observations, and I feel that such seminars would succeed if the organizers identified the right target audience and designed their programs to address its needs. Seminars these days are far too many, and every seminar has to bring in something different to be able to attract an audience.
With so much having been said about cloud computing and the great promise that it holds, it is but natural that companies examine it for feasibility and usefulness. While there is a tremendous vendor push on one hand, CIOs also face increased pressure from management, who favour a greater degree of outsourcing. However, progress on this front has been slow; there is a lot of smoke and less fire.
The software vendors/service providers are a major driving force pushing this solution; doing their bit to popularize it through advertisements, seminars, mass mailers and newspaper and magazine articles. However, deft handling might produce better results. I shall highlight this point using two examples:
a. Overhype: Though the voice has been heard far and wide, user perception is still hazy and the scene, for sure, is cloudy. Cloud computing is often touted as a solution for all ills, and this ends up discouraging users. Vendors would perhaps do better by investing in creating proper awareness. When approached by vendors, I have often asked them to study our set-up and suggest an appropriate way forward; but the response has not been very encouraging.
b. Are the vendors ready? While the sales representatives do a good job of selling the proposition, they muddle through the next step when discussing details. Often, their tariff structures are incomplete, as they have not considered the various usage and default conditions put to them. Licensing has also been a problem, since vendor policies on conversion from the existing perpetual licensing model to the revenue model of the cloud set-up are not clear. License fees for certain software, which are charged on the basis of CPUs used, are also a concern.
The way ahead for users
User companies, I am sure, are accustomed to the hype that gets created whenever new technologies are introduced into the market. Over time, as technologies mature, users also get wiser and slowly start evaluating, having more information at their command. Cloud computing as a solution, in my opinion, is just passing through this phase. Users are becoming more aware and getting into informed debates with vendors. However, they will do well to consider the following:
a. Evaluate and deploy the most appropriate solution: Cloud computing is here to stay. It is however important that we do not jump on to the bandwagon without adequate analysis and a proper assessment of organizational needs. Depending on the current IT landscape and the quality of solutions presented by the vendors, CIOs may find it more appropriate to move on to ‘platform as a service’ (PaaS) or ‘software as a service’ (SaaS) first and then move to the cloud at a later stage. Users should exercise their judgment and not get carried away.
b. Go ahead but exercise caution where necessary: Users often get stuck not knowing where to start. They say they have servers/storage etc. which are still functional and therefore an impediment to moving the applications that run on them. However, all resources will never become obsolete at once, and we will always have machines of different vintage. It is sometimes better to move a new application to the cloud or move a current application which suffers a bottleneck rather than making fresh investments. It is the first step that matters. And if that works, the further march gets so much easier.
In short, 'cloud computing' is an interesting journey interspersed with the usual roadblocks and challenges. Adequate planning and preparation, however, make the journey easier and more fruitful. Where there is a will, there is a way.
Having dealt with the basics of what cloud computing is, let us go further into the subject and talk about its main characteristics and deployment models. With so much hype surrounding the topic, simple matters often get missed, leaving us a little skeptical of its utility. Many therefore stay unsure, wondering whether this solution is appropriate for them and whether this is the right time to adopt it.
We can look at the advantages of cloud computing and examine whether these would benefit us. Cloud computing exhibits the following key characteristics:
1. Reduction in costs: The cloud model (especially the public cloud) works on a shared delivery model, which spreads resources and costs across a large pool of users and therefore brings down costs. Instead of making capital investments, companies incur operational expenses. This lowers barriers to entry, as the infrastructure is typically provided by a third party and does not need to be purchased for one-time or infrequent, intensive computing tasks. Pricing is on a utility computing basis, and the user is charged only for the resources used.
2. Agility: Since resources are provided on demand, the cloud brings in agility, improving users' ability to re-provision technology and infrastructure resources. It offers scalability and elasticity, with resources provisioned on a self-service, near real-time basis.
3. Device and location independence: Users are able to access systems using a web browser, regardless of their location or what device they are using (e.g., PC, mobile phone). As infrastructure is off-site (typically provided by a third-party) and accessed via the Internet, users can connect from anywhere.
4. Peak-load capacity management: Ordinarily, we deploy infrastructure to meet peak-load demands, and it lies idle at other times. This problem is taken care of by provisioning resources on demand. There are large utilization and efficiency gains to be had for systems that are often only 10–20% utilized.
5. Reliability: Service levels are assured through the SLAs signed and the service provider usually provides for multiple redundant sites, which makes well-designed cloud computing architecture suitable for business continuity and disaster recovery.
6. Performance monitoring: SLAs should cover performance monitoring services that the partner must provide. The user company can thus be relieved of this responsibility.
7. Security: The subject of security is often discussed, and there are serious concerns, especially about the public cloud model. In my opinion, security could actually improve, owing to the centralization of data, increased security-focused resources, and so on. Nonetheless, concerns persist about the loss of control over certain sensitive data, and about security on shared platforms.
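The arithmetic behind points 1 and 4 above can be sketched quite simply. The rates, lifetimes and demand figures below are hypothetical assumptions for illustration, not quotes from any vendor:

```python
def on_premise_monthly_cost(capex, lifetime_months, monthly_opex):
    """Amortized monthly cost of owned infrastructure sized for peak load."""
    return capex / lifetime_months + monthly_opex

def cloud_monthly_cost(hours_used, rate_per_hour):
    """Utility-style billing: pay only for the resources actually used."""
    return hours_used * rate_per_hour

# Hypothetical owned server: $12,000 up front, 36-month life, $150/month to run.
owned = on_premise_monthly_cost(12_000, 36, 150)

# The same workload rented for 200 hours a month at a notional $0.90/hour.
rented = cloud_monthly_cost(200, 0.90)

print(f"on-premise: ${owned:,.2f}/month, cloud: ${rented:,.2f}/month")

# The peak-load problem: capacity sized for the busiest hour sits idle the
# rest of the time. With a peak demand of 100 units and a typical demand of
# 15 units, utilization is only 15% -- in line with the 10-20% figure above.
peak_demand, average_demand = 100, 15
print(f"utilization when sized for peak: {average_demand / peak_demand:.0%}")
```

The point is not the particular numbers but the shape of the comparison: the owned machine costs the same every month whether busy or idle, while the utility bill tracks actual usage.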
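As a minimal sketch of what the 'elasticity' in point 2 means in practice, the toy function below (the names and thresholds are hypothetical, not any vendor's API) sizes a fleet to current demand instead of keeping a fixed, peak-sized pool:

```python
import math

def instances_needed(requests_per_sec, capacity_per_instance, minimum=1):
    """Provision just enough instances for the current load, never fewer
    than a floor that keeps the service available."""
    return max(minimum, math.ceil(requests_per_sec / capacity_per_instance))

# Demand fluctuates through the day; the fleet follows it up and down.
for load in (50, 400, 1200, 80):
    print(f"{load:>5} req/s -> {instances_needed(load, 250)} instances")
```

A real provider automates exactly this kind of rule behind its self-service interface, which is what makes near real-time re-provisioning possible.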
Various models of deployment are possible and organizations have been using them for a while. Let us discuss two of them which are usually debated.
The public cloud: A public cloud is based on the standard cloud computing model, in which a service provider makes resources such as applications and storage available to the general public over the Internet. Public cloud services may be offered on a pay-per-usage model. Since this is a shared-services model, it really helps in bringing down costs.
The private cloud: A private cloud is infrastructure operated solely for a single organization, whether managed internally or by a third party, and hosted internally or externally. Here, companies in effect try 'cloud computing at home' instead of turning to an Internet-based service. The idea is that you get all the scalability, metering and time-to-market benefits of a public cloud service without ceding control, security and recurring costs to a service provider.
The private cloud model has, however, attracted criticism because users 'still have to buy, build, and manage them', and thus do not benefit from lower up-front capital costs and less hands-on management. Adoption of the public cloud is largely influenced by security concerns, and I am sure that as the concept matures and these issues are addressed, a larger number of organizations will move to the public cloud.