Copyright © 2024 CGTN. 京ICP备20000184号
An illustration shows a 3D-printed Google logo placed on an Apple MacBook. /Reuters
Tim Cook's Apple announced a splashy deal with OpenAI on Monday to include its powerful artificial intelligence (AI) model as part of its voice assistant, Siri. However, the fine print of a technical document Apple published after the event makes clear that Alphabet's Google has emerged as another winner in the company's quest to catch up in AI.
To build Apple's foundation AI models, the company's engineers used its own framework software with a range of hardware, specifically its own on-premise graphics processing units (GPUs) and chips available only on Google's cloud called tensor processing units (TPUs).
Google has been building TPUs for roughly 10 years, and has publicly discussed two flavors of its fifth-generation chips that can be used for AI training; the performance-focused variant of the fifth generation is competitive with Nvidia's H100 AI chips, Google said.
Google announced at its annual developer conference that a sixth generation will launch this year.
The processors are designed specifically to run AI applications and train models, and Google has built a cloud computing hardware and software platform around them.
Apple and Google did not immediately return requests for comment. Apple did not discuss the extent to which it relied on Google's chips and software compared with hardware from Nvidia or other AI vendors.
Using Google's chips typically requires a client to purchase access to them through its cloud division, much in the same way customers buy computing time from Amazon.com's AWS or Microsoft's Azure.