Apple recently introduced a beta version of Apple Intelligence, its new set of AI-powered features, alongside developer betas of iOS 18.1, iPadOS 18.1, and macOS Sequoia. The launch marks a significant push to expand AI capabilities across Apple's devices. One surprising detail is Apple's reliance on Google hardware to train its AI models: rather than Nvidia GPUs, the industry's dominant choice for AI training, Apple used Google's Tensor Processing Units (TPUs), an unusual collaboration between two longtime rivals.
A recent publication, "Apple Intelligence Foundation Language Models," details the technical side of Apple Intelligence. Aimed at researchers, the paper reveals that Apple's two foundation models, AFM-on-device and AFM-server, were trained on Google's TPUs. The AFM-on-device model, which runs locally on iPhones, was trained on 2,048 TPUv5p chips, while the larger AFM-server model, which handles more complex tasks in the cloud, was trained on 8,192 TPUv4 chips.
Apple's AI suite also includes specialized models, such as a coding model that assists developers in Xcode and a diffusion model that powers visual features, including image generation in the Messages app. Despite speculation that Apple would turn to Nvidia's chips given the competitive AI landscape, the company chose Google's technology instead.
At the 2024 Worldwide Developers Conference (WWDC), Apple presented Apple Intelligence as a centerpiece of its upcoming software updates. The features aim to improve the user experience by helping draft messages, generate images, and automate routine tasks.
Currently, Apple Intelligence is available in the developer betas on the iPhone 15 Pro and Pro Max and on recent iPads and Macs. To access it, users must update their devices and then join a waitlist. The beta includes a redesigned Siri interface, better command recognition, and improvements to photo search, movie creation, and text generation. Additional features will roll out over the course of the year.