It’s been a long time since tech’s biggest companies could be sorted into discrete buckets according to the products they pioneered: Google, the search giant; Apple, the computer and mobile device innovator; Amazon, the e-commerce leader; Microsoft, the business software stalwart; and Facebook, the social media engine.
Since then, these major innovators have built on a common resource—increasingly powerful data processing chips—to branch out into a host of new consumer and business areas. These include Web-based data storage and computing; Web-connected virtual assistants that talk; music and video streaming; self-driving cars; virtual reality; and augmented reality.
A diagram of the current competitive relationships among these giants might look like a tangled plate of spaghetti, or like overlapping Venn diagrams. For example, Google competes with Microsoft and Amazon in cloud hosting; Apple’s talking virtual assistant competes with Amazon’s and Google’s; and Facebook competes with Amazon and Apple in video streaming.
Now, some of these tech giants are expanding outside their own competitive circles. Over the past year, they’ve been making significant moves onto the turf of the companies that have long provided the foundation of their activities: chipmakers such as Intel and Nvidia.
Chips designed for A.I.
The trigger for this movement is artificial intelligence and the demanding tasks tackled by devices that run A.I. software, as Wired has observed. Virtual assistants that can understand human speech, and cars that can “see” the difference between a highway median line and a roadway curb, require advanced processors built to run neural networks, layered software loosely modeled on the human brain.
Google and Microsoft have developed their own chips to support the A.I. capabilities of their newest products, such as Microsoft’s HoloLens, a “mixed reality” headset that brings holograms to life in users’ real-world surroundings and lets them interact with those holograms. Apple, too, may have quietly entered the A.I. chip race, as Bloomberg and other news outlets have speculated.
In response, established chipmakers including Nvidia and Intel are pressing forward with their own processor innovations, determined to keep the tech titans that have disrupted so many industries from rolling through their own. Nvidia is defending the dominance it gained early in the artificial intelligence era, when its powerful graphics processing units were adapted for A.I. work. In May, the company unveiled the Tesla V100, a chip designed to speed up a type of A.I. computation called deep learning.
The established chipmakers are also collaborating with the big tech companies directly. Nvidia has announced several collaborations with Amazon, and San Diego-based Qualcomm is reportedly working with Facebook as it develops chips suited for machine learning.
Meanwhile, startups such as Palo Alto, CA-based Groq; Redwood City, CA-based Mythic; Campbell, CA-based Wave Computing; and U.K.-based Graphcore are filling out the field in the A.I. chip race. Some have already been snapped up in acquisitions.
For example, Intel spent $16.7 billion to acquire San Jose, CA-based Altera in 2015, and last year acquired San Diego-based Nervana Systems. Intel, a leading supplier of data center processors, bought the companies in a drive to develop faster, programmable chips that can support tasks such as the training of neural networks.
A.I. chips boost both cloud and edge computing
Google rolled out its A.I.-adapted Tensor Processing Unit, or TPU, in mid-2016, and announced a second generation, the Cloud TPU, in May. The chips are already in use at its data centers, and could help Google entice customers to try its Web-based computing business rather than sticking with the widely used Amazon Web Services or other cloud computing hosts.
But the new generation of sophisticated processors could also shift many computing chores away from rented cloud servers, and back to user-owned devices—reversing a trend over the past decade.
Big businesses and startups alike have been turning to outside Web-based computing services to amplify the processing power of their own machines, to avoid buying their own servers, and to take advantage of other benefits, such as automatic updates of Web-based software sold as a subscription. But running computing tasks through distant data centers doesn’t always work well with emerging technologies such as virtual reality.
Even a slight lag in response during a virtual reality game can make a goggle-wearing player dizzy or nauseated. Players who turn their heads expect the landscape to change in time with their movements. Instead of relying on cloud computing to speed up the response, tech companies want to equip VR headsets with powerful A.I. chips that can analyze the player’s movements on the spot and instantaneously adjust the scene to the player’s new field of view.
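To put rough numbers on that reasoning (the figures below are illustrative assumptions, not measurements cited in this story), a comfortable VR experience is often said to require total motion-to-photon latency under roughly 20 milliseconds, and a single network round trip to a distant data center can eat that entire budget on its own. A minimal sketch of the arithmetic, in Python:

    # Illustrative latency-budget sketch; all figures are rounded assumptions,
    # not benchmarks from the companies mentioned in this article.
    VR_BUDGET_MS = 20  # commonly cited motion-to-photon target for comfortable VR

    cloud_path_ms = {
        "network round trip to a distant data center": 40,
        "server-side inference": 10,
        "local rendering": 8,
    }
    on_device_path_ms = {
        "inference on a dedicated A.I. chip in the headset": 3,
        "local rendering": 8,
    }

    for label, path in (("cloud", cloud_path_ms), ("on-device", on_device_path_ms)):
        total = sum(path.values())
        verdict = "within" if total <= VR_BUDGET_MS else "over"
        print(f"{label}: {total} ms, {verdict} the {VR_BUDGET_MS} ms budget")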
Similarly, self-driving cars could be dangerous if their navigational powers depended on flickering Internet connections to cloud servers. But with powerful A.I. chips on board to analyze the surrounding landscape, cars, trucks, drones, and other vehicles could exercise “autonomous intelligence.”
The impulse to shift high-level data analysis from central data centers to the scene of the action is not confined to frontier consumer products. It’s also seen as a practice that could help fulfill the promise of the industrial Internet of Things.
GE division GE Digital says the increasing analytical power of computing devices has opened the door to “edge computing”—locating data analysis close to where data is being collected from industrial machines such as wind turbines, remote sensors, and equipment control mechanisms. Many such machines are not consistently or reliably connected to the Web, the GE unit says. Web connections also provide an attack surface for hackers, so more local data analysis may strengthen the security of certain operations.
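As a concrete illustration of that edge-computing pattern, here is a minimal Python sketch, with hypothetical names, thresholds, and data rather than anything from GE’s actual software: readings are analyzed on a device next to the machine, and only a compact summary travels over the intermittent Web connection.

    # Minimal edge-computing sketch: analyze sensor readings locally, forward only
    # a small summary when a connection is available. Names, thresholds, and data
    # are hypothetical illustrations, not GE Digital's software.
    from statistics import mean

    VIBRATION_LIMIT = 4.5  # hypothetical alert threshold, arbitrary units

    def analyze_at_edge(readings):
        """Run the analysis on a device near the turbine, connected or not."""
        avg = mean(readings)
        return {"average": round(avg, 2), "alert": avg > VIBRATION_LIMIT}

    def forward_summary(summary, connected):
        # Only the summary, not the raw data stream, needs the (possibly flaky) uplink.
        if connected:
            print("uploading to cloud:", summary)
        else:
            print("holding locally until reconnected:", summary)

    forward_summary(analyze_at_edge([4.2, 4.8, 4.6]), connected=False)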
The GE unit envisions a mixed use of cloud and edge computing, to make the most of both options. But quick, localized analysis of machine output could allow for “insightful and intelligent actions right at the edge,” GE Digital says.