Meta teams up with Microsoft, Qualcomm for latest AI model Llama 2
Facebook parent Meta on Wednesday released the commercial version of its open-source AI model Llama. Dubbed Llama 2, the new version of the AI model will be distributed by Microsoft through its Azure cloud service and will run on the Windows operating system. The tie-up between Microsoft and Meta is targeted at enterprises, offering them the capability to build apps using generative AI tools.
In addition, Meta is working with Qualcomm to integrate Llama 2 AI implementations into smartphones and PCs starting next year.
The company made these announcements as part of Microsoft's Inspire 2023 event, where Meta CEO Mark Zuckerberg said that the AI model, "previously exclusively offered to a limited number of academics for research endeavours, will now be accessible through direct downloads".
At its Inspire conference, Microsoft called Meta a "preferred" partner but said the model will also be made available through other platforms, including its main cloud rival Amazon Web Services (AWS), as well as AI startup Hugging Face and others.
Large language models are essential in powering generative AI chatbots like OpenAI’s ChatGPT and Google’s Bard. Microsoft, which recently introduced an AI-powered Bing search, has incorporated ChatGPT into its platform. The tech firm is also a major funder and partner of OpenAI. But neither ChatGPT nor similar offerings from Microsoft or Google are open-source.
“We believe an open approach is the right one for the development of today’s AI models, especially those in the generative AI space where the technology is rapidly advancing,” Meta said in a statement.
“Opening access to today’s AI models means a generation of developers and researchers can stress test them, identifying and solving problems fast, as a community,” Zuckerberg said in a Facebook post on Tuesday, adding that open access also "improves safety and security because when software is open, more people can scrutinize it to identify and fix potential issues."
Zuckerberg also pointed to Meta's history of open-sourcing its AI work, such as with its development of the widely used machine-learning framework PyTorch.
Meta first announced its Llama model in February and said it received over 100,000 requests from researchers to use it. The open-source Llama 2 will likely have a far bigger reach, and the latest version was trained on 40% more data than Llama 1.
The company also said that Llama 2 outperforms other LLMs such as Falcon and MPT on knowledge tests, reasoning, proficiency and coding.
Under the Qualcomm partnership, Llama 2-based AI implementations will come to smartphones and PCs starting in 2024, powering AI apps that do not rely on cloud services to function. The two companies aim to enable customers, partners, and developers to build a range of use cases, including intelligent virtual assistants, productivity applications, content creation tools, and entertainment experiences.
Durga Malladi, senior vice president and general manager of technology, planning and edge solutions businesses, Qualcomm Technologies, said, “To effectively scale generative AI into the mainstream, AI will need to run on both the cloud and devices at the edge, such as smartphones, laptops, vehicles, and IoT devices.”
Powered by Qualcomm’s Snapdragon 8 Gen 2 processor, these on-device AI experiences can function even in areas with no connectivity or in airplane mode. The processor currently powers the Samsung Galaxy S23 series and is also used in devices from various other phone and PC makers.
Meta and Qualcomm Technologies have a longstanding history of working together to drive technology innovation and deliver the next generation of premium device experiences. This collaboration with Meta will further expand the reach of Llama 2 AI technology, Qualcomm said.