Google takes the cloud to a new level
You may have heard the axiom "software is eating the world." It's the idea that digital offerings (e.g. streaming movies) are replacing physical ones (e.g. movies in the theater).
Google is kicking it up a notch. It has a plan to replace costly new data centers — also known as "the cloud" — with artificially intelligent software. It's a big, audacious idea, with far-reaching implications.
Data centers make the cloud revolution possible. They are a key link in a convoluted process that involves hundreds of miles of fiber-optic cable, switches, servers and mission-critical processes. It’s not something every corporation can do. It’s also the foundation of Google’s Internet empire.
Growth, and the prospect of exponentially more to come, pushed the company to innovate.
Six years ago, Google had the bright idea that voice was the next big thing in computing. Its solution was artificially intelligent voice actions deployed across its Android mobile platform, used by 1 billion people. Those neural networks required massive amounts of data and threatened to quickly overwhelm the network. Engineers figured a twofold expansion of its data centers would be required.
Google has just escalated the war over your Internet time by super-charging its cloud centers with new technology.
The chipset, which Google calls the Tensor Processing Unit (TPU), is 30- to 80-times more efficient, runs AI workloads up to 30-times faster than conventional GPUs and CPUs, and requires a "surprisingly small amount of code: just 100 to 1,500 lines," writes Norm Jouppi, a project engineer at Google.
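Why does neural-network code stay so small? Because inference boils down to a handful of dense matrix multiplications — exactly the arithmetic a TPU is built to accelerate. Here is a minimal sketch of a toy two-layer network in Python with NumPy; the layer sizes and random weights are illustrative assumptions, not Google's actual models.

```python
import numpy as np

# Toy two-layer feedforward network. Inference is just two matrix
# multiplies plus a nonlinearity -- the dense arithmetic that
# custom AI chips are designed to speed up.
rng = np.random.default_rng(0)
W1 = rng.standard_normal((784, 128))  # input -> hidden weights (made up)
W2 = rng.standard_normal((128, 10))   # hidden -> output weights (made up)

def predict(x):
    h = np.maximum(x @ W1, 0.0)       # hidden layer with ReLU activation
    logits = h @ W2                   # output layer scores
    return int(logits.argmax())       # index of the highest-scoring class

x = rng.standard_normal(784)          # a stand-in "input image"
print(predict(x))                     # prints a class index between 0 and 9
```

The whole model fits in a dozen lines; scaling it up mostly means bigger matrices, not more code — which is why specialized matrix-multiply hardware pays off so quickly.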
The TPU has been operational since 2014, and it now powers every AI facet of the Google enterprise. This includes everything from Google Photos and Image Search to DeepMind's AlphaGo, the software that mastered the ancient Chinese strategy game Go.
It has also allowed its parent, Alphabet, to forgo building at least 12 costly new data centers.
Google is not the only company building its own data center hardware. Amazon (AMZN), Microsoft (MSFT) and Facebook (FB) all furnish their data centers with custom-made switches, servers and chipsets. The difference is that the TPU is a processor designed specifically to speed up AI. It heads off the data deluge at the source: the hungry code.
This is a shot across the bow at Nvidia (NVDA) and Intel (INTC), the leading makers of GPUs and CPUs. It will not displace those technologies. Recently, IBM (IBM) announced it would outfit its data centers with processors and software developed by Nvidia. And Intel, the undisputed leader in data center CPUs, is upping its game with new architectures, too.
Google still operates 15 massive data centers on four continents. Despite the cost savings from the TPU, it plans to spend a whopping $10 billion in 2017 building new locations all over the world.
The takeaway from the TPU is possibility. Twenty years ago, sending a record, book or movie to a friend meant buying it at a physical store, then shipping it by mail. Today you just click on an image at Amazon.com and send a digital file. The abstraction is instantaneous and cost-effective.
Google found a way to abstract data centers. Think about that for a moment.
If technology can abstract away physical things that involve billions of dollars of capital investment, think about what else is possible. Think about the inevitable lucrative new business models. But also think about the business that could be lost by data-center-equipment makers and server-colocation real-estate firms.
Software really is eating the world. It’s a terrific time to be an adaptive, informed, patient investor.