Apple researchers show how the company plans on running AI models on-device
Apple will be a late entrant into the artificial intelligence space when it reveals its next-generation operating systems for iPhones, iPads, and Macs at its Worldwide Developers Conference (WWDC) on June 10. Bloomberg has reported that Apple is developing its own large language model (LLM) to power on-device generative AI features. But is it possible to run an entire AI model without any cloud-based processing? Apple researchers think it is.
Apple’s on-device AI
In a research paper titled “LLM in a flash: Efficient Large Language Model Inference with Limited Memory”, Apple researchers describe how the company plans to run large AI models on iPhones, iPads, MacBooks, and other memory-constrained devices.
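As the paper’s title suggests, the core idea is to keep model weights in flash storage and pull only the parameters needed for the current computation into RAM, rather than loading the entire model up front. The Python sketch below illustrates that general pattern with a memory-mapped weight file; the file name, layer dimensions, and fraction of “active” rows are illustrative assumptions, not Apple’s actual implementation.

```python
import numpy as np

# Illustrative sketch (not Apple's code): keep a large weight matrix on
# flash storage as a memory-mapped file and read only the rows needed
# for the current token into RAM.

ROWS, COLS = 4096, 1024           # hypothetical layer dimensions
WEIGHTS_PATH = "ffn_weights.npy"  # hypothetical file on flash storage

# One-time setup: write random weights to disk to stand in for a real checkpoint.
np.save(WEIGHTS_PATH, np.random.rand(ROWS, COLS).astype(np.float16))

# Memory-map the file: the OS pages data in from flash only when it is touched.
weights = np.load(WEIGHTS_PATH, mmap_mode="r")

def partial_matmul(x: np.ndarray, active_rows: np.ndarray) -> np.ndarray:
    """Compute x @ W[active_rows].T, reading only the needed rows from flash."""
    active = np.asarray(weights[active_rows])  # pages in just these rows
    return x @ active.T

# Example: only ~5% of rows are needed for this step, so only ~5% of the
# layer's weights ever leave flash storage.
x = np.random.rand(COLS).astype(np.float16)
active_rows = np.random.choice(ROWS, size=ROWS // 20, replace=False)
out = partial_matmul(x, active_rows)
print(out.shape)
```

The point of the sketch is the access pattern: because most of the weights stay in flash and are fetched on demand, peak RAM use depends on how many parameters are touched per step, not on the total model size.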