CLOUD COMPUTING 101

Cloud computing is an environment in which users gain access, on an as-needed basis, to virtualized and dynamically scalable computing resources. Said another way, cloud computing is Internet-based computing, whereby shared resources, software, and information are provided to computers and other devices on demand, much as electricity is delivered by the power grid.

Cloud computing represents a paradigm shift comparable to the move from mainframe to client–server computing in the early 1980s. Details are abstracted from users, who no longer need expertise in, or control over, the technology infrastructure “in the cloud” that supports them.

In general, cloud computing customers do not own the physical infrastructure; they avoid capital expenditure by renting usage from a third-party provider, consuming resources as a service and paying only for what they use. Most cloud computing offerings employ the utility computing model, in which computing is metered and billed much like a traditional utility service such as electricity. Sharing this “perishable and intangible” computing power among multiple tenants improves utilization rates, because servers are not left idle unnecessarily; this can reduce costs significantly while speeding up application development.
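To make the utility model concrete, here is a minimal Python sketch of pay-per-use billing versus owning fixed capacity. Every number in it is hypothetical, chosen only to illustrate the arithmetic; real providers publish their own rates and billing granularity.

    # Sketch of utility (pay-per-use) billing versus owning fixed capacity.
    # All prices and the workload profile are hypothetical illustrations.

    HOURS_PER_MONTH = 730

    def on_demand_cost(hours_used: float, rate_per_hour: float) -> float:
        """Utility model: pay only for the hours actually consumed."""
        return hours_used * rate_per_hour

    def owned_cost(amortized_capex: float, monthly_opex: float) -> float:
        """Owned hardware: the bill is the same whether servers sit idle or not."""
        return amortized_capex + monthly_opex

    # A bursty workload that needs a server only 20% of the month.
    hours_used = 0.20 * HOURS_PER_MONTH                           # 146 hours
    print(on_demand_cost(hours_used, rate_per_hour=0.50))         # 73.0
    print(owned_cost(amortized_capex=300.0, monthly_opex=150.0))  # 450.0

The point is not the specific figures but the shape of the bill: under the utility model the cost tracks actual usage, so idle time costs nothing.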

Benefits-at-a-Glance

  1. Pay for and consume IT infrastructure only as and when you need it
  2. Convert capital expenditures into operating expenses
  3. Lower the total cost of ownership of your IT infrastructure
  4. Increase the scalability of your infrastructure (see the sketch after this list)
  5. Decrease your time to market
  6. Reallocate staff to value-added activities rather than routine operations
  7. Increase IT capabilities and agility as business needs change
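
To illustrate items 4 and 7, here is a minimal sketch of the threshold-based autoscaling logic behind “dynamically scalable” infrastructure. The thresholds, capacity bounds, and function names are hypothetical stand-ins for whatever monitoring and provisioning APIs a real provider exposes.

    # Hypothetical threshold-based autoscaling: the elasticity that lets
    # capacity grow and shrink with demand. All values are illustrative.

    MIN_SERVERS = 2    # keep a small baseline for availability
    MAX_SERVERS = 20   # cap spend even under extreme load

    def scale(current_servers: int, avg_load: float) -> int:
        """Return the new server count for an observed average load (0.0-1.0)."""
        if avg_load > 0.80 and current_servers < MAX_SERVERS:
            return current_servers + 1   # add capacity before users see slowdowns
        if avg_load < 0.30 and current_servers > MIN_SERVERS:
            return current_servers - 1   # release idle capacity to stop paying for it
        return current_servers           # within the comfort band: do nothing

    print(scale(current_servers=4, avg_load=0.90))  # traffic spike -> 5
    print(scale(current_servers=4, avg_load=0.10))  # quiet period  -> 3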