Cloud Computing and the Rules of Machines in the 21st Century
By Dr. Hossein Eslambolchi, CEO, CyberFlow Analytics
Cloud computing, as NIST defines it, is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction. Given the massive power of cloud and quantum computing, I predict that robotics, much like the Terminator, will ultimately take over 75% of our daily activities by 2025.
Cloud computing in this decade is becoming more social, political and financial. As such, one needs to pay close attention to this technology in order to optimize enterprises across the globe in different vertical industries.
“We will be dealing with massive amounts of data to manage as storage for the future, and analytics, including linear programming for this network and data, become imminent in this decade.”
Cloud growth in this decade is a force to be reckoned with. I predict global traffic will grow from 130 exabytes in 2011 to over 1.6 zettabytes annually by 2015. In addition, with the rise of video streaming by content providers such as ESPN, I expect over 22 trillion hours of streaming video by 2014, of which 5 trillion hours will be business web conferencing with webcam alone. It is also clear that with the advent of 3D TVs and content made available in 3D format, which uses even more bandwidth, over 1.6 trillion hours of online HD/3D video streaming will be sent over the Internet by 2014.
Along with growth in IP traffic, one needs to look at the power of the cloud as it relates to data centers and their traffic-carrying load. Data center traffic is expected to grow by 33% annually from 4.8 zettabytes in 2011, with nothing stopping this growth given the number of applications and innovations that need server infrastructure and more bandwidth-driven applications. I also believe 76% of IP data will stay within the data center as virtual machines migrate from server to server, which makes cloud computing paramount in designing next-generation data centers. For example, today Netflix represents 33% of U.S. peak downstream traffic, and video in total makes up 32.6% of peak downstream mobile traffic, of which YouTube is the largest contributor; more will come with other content such as live sports.
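As a rough sanity check on these figures, a 33% annual growth rate compounds quickly. A minimal sketch, assuming the 4.8-zettabyte 2011 baseline quoted above and projecting out to 2015:

```python
# Project data center traffic under the 33% annual growth rate quoted
# above, starting from the 4.8 ZB baseline in 2011 (figures from the text).
def project(base_zb: float, rate: float, years: int) -> float:
    """Compound base_zb at `rate` per year for `years` years."""
    return base_zb * (1 + rate) ** years

traffic_2015 = project(4.8, 0.33, 4)  # 2011 -> 2015 is four compounding steps
print(f"Projected 2015 data center traffic: {traffic_2015:.1f} ZB")  # ~15.0 ZB
```

At that rate, traffic roughly triples over four years, which gives a feel for why the article treats this growth as unstoppable.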
In general, we will be dealing with massive amounts of data to manage as storage for the future, and analytics, including linear programming for this network and data, become imminent in this decade. Essentially, knowledge mining will truly drive new businesses and solutions across vertical industries through analytics and deep packet inspection of content.
The total cloud market is expected to exceed $40B in 2012, growing to over $100B in 2015. The power of this growth is that the cloud unleashes a level of utility computing that has never been seen before, becoming a game changer when matched with advanced communications and applications.
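The market figures above imply a compound annual growth rate (CAGR) of roughly 36%. A small sketch backing that number out of the $40B (2012) and $100B (2015) endpoints quoted above:

```python
# Implied compound annual growth rate (CAGR) for the cloud market figures
# in the text: $40B in 2012 growing to $100B in 2015 (a 3-year span).
def cagr(start: float, end: float, years: int) -> float:
    """Annualized growth rate that takes `start` to `end` in `years` years."""
    return (end / start) ** (1 / years) - 1

rate = cagr(40.0, 100.0, 3)  # 2012 -> 2015
print(f"Implied cloud market CAGR: {rate:.1%}")  # roughly 36% per year
```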
Rules of the Machines & Cloud
If we look at the 1940s, we performed computations at a rate of one per 150 seconds, and IBM's SSEC delivered around 500 cycles per second (CPS). Forty years later, in the early 1980s, IBM's PC had improved computational power to over 250K CPS, and the Cray supercomputer was running at 86M CPS for scientific and heavy analytics workloads. By 1987, IBM's PS/2 could do 13M CPS, and by 2011 we had advanced even further: Apple's iPad at 1.7B CPS, the Earth Simulator at 38T CPS and Sony's PlayStation at 2.1 trillion CPS.

I predict that with atoms being manipulated at below 10 nanometers, we could see a computer capable of 8.6 quadrillion CPS in the near future, and when quantum computing arrives sometime in the next 6-8 years, the amount of computation will be so massive that driving applications of the future, like robotics, will become rather easy to implement in real life. In essence, by 2020 we will live in a robotic world in addition to an IP and virtual one, changing the notion of reality for everyone across the planet within the next decade. That is a high-level prediction, but I see it very clearly given the advent of, and improvements in, computing power. Computing is set to surpass Moore's law by a wide margin, relegating Moore's law to yet another disproven theory for the history books.
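For comparison with Moore's law, the CPS figures above imply a long-run doubling time that can be computed directly. A rough back-of-the-envelope sketch; the CPS numbers come from the text, while the 1948 date for the SSEC is my addition (the machine was dedicated that year):

```python
import math

# Long-run doubling time implied by the compute figures above:
# IBM SSEC at ~500 CPS (dedicated in 1948) versus the Sony PlayStation
# at 2.1 trillion CPS in 2011.
def doubling_time(cps_start: float, cps_end: float, years: float) -> float:
    """Years per doubling implied by growth from cps_start to cps_end."""
    return years / math.log2(cps_end / cps_start)

d = doubling_time(500, 2.1e12, 2011 - 1948)
print(f"Implied doubling time: {d:.1f} years")  # about 2 years per doubling
```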
What else is possible with such massive computing power? Consider that today we can see entire galaxies thanks to massive investments in telescopes, whereas just 60 years ago only objects within 10M light years of Earth were visible. We can now see around 10B light years away, to the most distant objects in the Hubble Deep Field. I predict that with gamma ray technology and the computational power of quantum computing, we could see over 12B light years away and finally determine whether there is other life outside our solar system. Computation will be so fast that one could derive intelligence very quickly from massive data and quantum computing in the cloud. This is why the cloud is becoming the de facto standard of the 21st century for all enterprises and service providers across the planet.
There are some key issues that service providers must address in order to ride this massive transformational change that is coming to everyone and every business, and to not miss this $100 billion market by 2015. Let me share some of the most important ones, in order of priority, as part of my new top 10:
1. Security & Availability
2. Unified Communications
3. Enterprise Applications
4. Cloud Orchestration
5. Cloud-Based Services
6. Data Analytics
7. Converged Services
8. Infrastructure Value
9. Platform Virtualization
10. Network Virtualization