Qualcomm (QCOM) chips will support Meta's (META) Llama 2 large language model (LLM), a move that will bring LLMs – which typically need data centers to operate due to their massive computational demands – to smartphones and PCs starting in 2024.
- Qualcomm has announced a partnership to run Meta's Llama 2 on its chips starting in 2024.
- Meta's Llama 2 large language model will run on Qualcomm chips for mobile devices and PCs.
- Large language models typically need large data centers to operate due to massive compute requirements.
- The tie-up could give Qualcomm an AI-related boost against rival Nvidia.
What does this mean for Qualcomm?
In a big week for Meta's LLM challenger to the popular ChatGPT app, the two companies said Llama 2 will run on Qualcomm chips for mobile devices and PCs starting in 2024. The deal could be a breakthrough for the chipmaker, pitting the company against Nvidia (NVDA), whose stock has soared to a $1 trillion market capitalization this year.
The first generation of LLMs has typically run on servers in large data centers powered by Nvidia's GPU processors, owing to the models' demand for massive computing power. This year's bull run in AI stocks has largely passed over companies such as Qualcomm because of its focus on mobile devices and PCs.
Qualcomm shares have risen about 11% this year, far behind the Nasdaq 100's nearly 45% gain.
What does this mean for Meta?
News of the partnership came as Microsoft (MSFT) also announced that it will support Llama 2 on Azure and Windows. Health care provider Teladoc (TDOC) was another company looking to embrace AI, using the tech giant's GPT-4 technology to record patient visits.
The deals could catapult Meta's LLM into the mainstream and also democratize the use of AI for businesses. Meta previously released the "weights" and code for Llama, while OpenAI's GPT-4 and Google's (GOOGL) Bard remain closed-source.
Do you have a news tip for Investopedia reporters? Please email us at firstname.lastname@example.org