Shocking Collaboration: Google’s Surprising Role in Apple’s AI Training

While Apple is indeed using OpenAI’s ChatGPT for some functionality, chiefly its chatbot services, most of its AI has been developed in-house. To do that, however, Apple has had to rely on Google for one major thing.

Unless one has been living under a rock, it would have been hard to miss Apple’s recent announcement of a significant partnership with OpenAI to integrate the company’s advanced AI model into Siri and to use ChatGPT in some of its generative AI applications.

This marks a major milestone in the company’s AI endeavors. However, a deeper look into the technical documentation released by Apple reveals another key player in this advancement: Alphabet’s Google.

Apple’s engineers used a combination of Google’s proprietary framework software and a variety of hardware to develop their own foundation AI models. Notably, they have had to rely heavily on Google’s tensor processing units (TPUs), which are available only through Google’s cloud services.

These TPUs, which Google has been refining for nearly a decade now, are highly specialized chips, designed specifically to handle the intensive computational tasks required for AI training.
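For readers curious what targeting these chips looks like in practice, the short sketch below assumes JAX, one of the Google-developed frameworks commonly used with Cloud TPUs. The article does not say which framework Apple’s engineers actually used, so the framework choice, the toy model, and every name in the snippet are purely illustrative.

```python
# Minimal sketch of running a computation on whatever accelerators JAX can see.
# On a Cloud TPU VM, jax.devices() reports TPU cores; elsewhere it falls back to CPU/GPU.
# Illustrative only; not a description of Apple's actual training setup.
import jax
import jax.numpy as jnp

print("Available devices:", jax.devices())

# A toy "training step": a linear model with a mean-squared-error loss.
@jax.jit  # jit compiles the computation through XLA onto the available accelerator
def loss_fn(weights, inputs, targets):
    predictions = inputs @ weights
    return jnp.mean((predictions - targets) ** 2)

key = jax.random.PRNGKey(0)
inputs = jax.random.normal(key, (128, 64))
weights = jax.random.normal(key, (64, 1))
targets = jnp.ones((128, 1))

# One step of plain gradient descent; the heavy math runs on the chip.
grads = jax.grad(loss_fn)(weights, inputs, targets)
weights = weights - 0.01 * grads
print("Loss after one step:", loss_fn(weights, inputs, targets))
```

Real training runs differ mainly in scale: the same compiled program is sharded across many TPU cores rented through Google Cloud rather than run on a single device.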

The fifth generation of these chips, which comes in different versions, boasts performance competitive with Nvidia’s state-of-the-art H100 AI chips and is much easier to access, as long as one is willing to work with Google’s cloud services.

Back in May, at Google I/O, its annual developer conference, Google announced the forthcoming release of its sixth-generation TPUs, which promise even greater performance. These processors are integral to Google’s cloud computing platform, which offers specialized hardware and software to support various AI applications.

While Apple did not specify how heavily it leaned on Google’s TPUs compared with alternatives such as Nvidia’s GPUs, it is clear that Google’s technology plays a significant role in Apple’s AI development.

Using Google’s chips typically means purchasing access to Google Cloud, much the way customers buy computing power from Amazon Web Services or Microsoft Azure.

As Apple continues to push forward in its development of Siri and other AI-driven features, the underlying support from Google’s cutting-edge TPUs exemplifies how industry leaders often depend on each other’s innovations to drive progress.
