CLOUD COMPUTING 101
Cloud computing is an environment in which users gain access, on an as-needed basis, to virtualized and dynamically scalable computing resources. Put another way, cloud computing is Internet-based computing in which shared resources, software, and information are provided to computers and other devices on demand, much like power from the electricity grid.
Cloud computing represents a paradigm shift comparable to the move from mainframe to client–server computing in the early 1980s. Details are abstracted from the users, who no longer need expertise in, or control over, the technology infrastructure "in the cloud" that supports them.
In general, cloud computing customers do not own the physical infrastructure; instead, they avoid capital expenditure by renting usage from a third-party provider. They consume resources as a service and pay only for what they use. Most cloud offerings follow the utility computing model, analogous to how traditional utilities such as electricity are metered and billed. Sharing "perishable and intangible" computing power among multiple tenants improves utilization rates, since servers are not left needlessly idle; this reduces costs significantly while speeding application development. For customers, the key benefits include:
- Pay for and consume the IT infrastructure as and when you need it
- Change capital expenditures to operating expenses
- Lower total cost of ownership of your IT infrastructure
- Increase the scalability of your infrastructure
- Decrease your time to market
- Reallocate staff to focus on value-added activities rather than routine operations
- Increase IT capabilities and agility as business needs change
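The capex-to-opex shift and the utility billing model above can be sketched with a back-of-the-envelope cost comparison. All prices, server counts, and workload numbers below are illustrative assumptions, not real provider rates: the point is that on-premises capacity must be provisioned for the peak, while cloud capacity is billed on the average actually consumed.

```python
# Hypothetical cost comparison: owning servers (capex, amortized)
# vs. renting cloud capacity on demand (opex, pay-per-use).
# All figures are illustrative assumptions, not real provider pricing.

HOURS_PER_MONTH = 730  # average hours in a month

def on_premises_monthly_cost(num_servers: int,
                             server_price: float = 8_000.0,
                             amortization_months: int = 36,
                             power_and_ops_per_server: float = 120.0) -> float:
    """Amortized capital cost plus fixed monthly operating cost per server."""
    capex = num_servers * server_price / amortization_months
    opex = num_servers * power_and_ops_per_server
    return capex + opex

def cloud_monthly_cost(avg_instances: float,
                       price_per_instance_hour: float = 0.10) -> float:
    """Utility model: pay only for instance-hours actually consumed."""
    return avg_instances * HOURS_PER_MONTH * price_per_instance_hour

# A bursty workload: needs 10 servers at peak, but averages only 3 in use.
# On premises you must buy for the peak; in the cloud you pay for the average.
peak_servers = 10
avg_instances = 3

print(f"on-premises: ${on_premises_monthly_cost(peak_servers):,.2f}/month")
print(f"cloud:       ${cloud_monthly_cost(avg_instances):,.2f}/month")
```

The gap widens as workloads get burstier: the on-premises cost is fixed by the peak, while the cloud bill tracks actual utilization.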