Qualcomm inserts GenAI into smartphones at industry's mega tradeshow

Just what Android fans were missing, amirite? A 7 billion parameter LLM that accepts image and voice prompts?

MWC Qualcomm is going big on AI at MWC, where it's showing off a 7 billion parameter large language model running on an Android phone, along with an online hub to help mobile devs blend models into their apps, and AI infused into its latest 5G modem and Wi-Fi 7 silicon.

The chips and telecoms biz says the grand plan is to bring "intelligent computing" to all kinds of devices at the edge, meaning PCs, cars and IoT devices, as well as the phones and Wi-Fi access points it is more closely associated with.

When any company mentions "intelligent computing," you know AI is on the agenda. Qualcomm claims it has already seeded the market with AI-capable hardware in the shape of its Snapdragon 8 Gen 3 platform for phones and Snapdragon X Elite for laptops, both announced last year with built-in neural processing units (NPUs), and now it is time to start using those features.

At MWC, the company is demonstrating Large Language and Vision Assistant (LLaVA), a large language model claimed to have over 7 billion parameters, running on an Android smartphone.

This model can accept not just text, but also images and voice as prompts. One of the demos involves feeding it pictures of various ingredients, then having it generate a recipe from them and estimate how many calories are in the resulting meal.

"All this engagement is 100 percent running on device," said Qualcomm product marketing director Ignacio Contreras, who claimed it is very responsive, able to keep up with several words per second of speech input. The key benefit of running an LLM on the device like this is privacy, according to Qualcomm, as your data isn’t being uploaded to the cloud for processing.

Qualcomm also unveiled its AI Hub, a central repository of resources for developers building AI applications to run on Snapdragon hardware. Its main feature is a library of models quantized, optimized, and tested to perform well on that hardware, with more than 75 available at launch.

Developers can choose the kind of model they are seeking, a framework such as TensorFlow or PyTorch, and a target platform. The target need not be the latest hardware and might be something like a Samsung Galaxy S23, according to Contreras, and the hub will guide the developer towards the right model. "In a few lines of code, the developer will be able to integrate these optimized models into the workflow and take advantage of on-device AI capabilities running on a Qualcomm platform," he said.
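Qualcomm hasn't published the hub's API in this piece, so here is a purely illustrative Python sketch of the selection flow Contreras describes: pick a task, a framework, and a target device, and get back an optimized model. The catalog entries, field names, and the `pick_model` helper are all invented for illustration and are not Qualcomm's actual API.

```python
# Hypothetical sketch of the AI Hub selection flow described above.
# Catalog contents and field names are invented, not Qualcomm's real API.

CATALOG = [
    {"task": "image-classification", "framework": "TensorFlow",
     "targets": ["Snapdragon 8 Gen 3", "Samsung Galaxy S23"],
     "model": "mobilenet_v3_quantized"},
    {"task": "text-generation", "framework": "PyTorch",
     "targets": ["Snapdragon 8 Gen 3"],
     "model": "llama_7b_quantized"},
]

def pick_model(task, framework, target):
    """Return the first optimized model matching the developer's choices."""
    for entry in CATALOG:
        if (entry["task"] == task
                and entry["framework"] == framework
                and target in entry["targets"]):
            return entry["model"]
    return None  # no optimized build for this combination

print(pick_model("image-classification", "TensorFlow", "Samsung Galaxy S23"))
# prints: mobilenet_v3_quantized
```

The point of the hub, per Qualcomm, is that this lookup (plus download and integration) collapses into those "few lines of code" on the developer's side.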

Also being demonstrated at MWC is the GIMP open-source graphics tool with a Stable Diffusion plugin running on a Snapdragon X Elite laptop alongside an x86 laptop, to show the benefit of having hardware built for generative AI. The Snapdragon box is claimed by Qualcomm to be three times faster at image generation than the system without an NPU.

New 5G modem and Wi-Fi connectivity silicon were introduced at MWC too, both claimed by Qualcomm to have AI-enhanced capabilities.

The Snapdragon X80 5G Modem-RF System was designed to support mobile networks with the latest 5G Advanced capabilities. It supports six-carrier downlink aggregation for sub-6GHz bands to offer faster speeds, and can support six antennas for better reception.

AI capabilities come via the chipset's dedicated AI tensor accelerator, which is used to optimize the use of multiple antennas in a smartphone, improving signal quality and therefore data rates, as well as to increase energy efficiency.

This is also the first Snapdragon wireless chipset to integrate NB-NTN, the 3GPP standard for narrowband connectivity over non-terrestrial networks, otherwise known as satellite connectivity.

Also getting the AI treatment is the FastConnect 7900 mobile connectivity system, where it is similarly being used to boost performance and increase energy efficiency.

The primary use here is to work out what the Wi-Fi connection is being used for, such as watching videos, listening to music, or joining online meetings, and optimize the connection accordingly.

"The AI within FastConnect 7900 can understand what you are doing, and based on that optimize things like latency and power. We have tested these AI capabilities in some popular applications, and we have seen power reductions up to 30 percent compared to without the use of AI, and we can better adjust many of the Wi-Fi parameters to provide the best user experience," Contreras said.
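Qualcomm doesn't detail how the classification feeds back into the radio, but the idea Contreras describes can be sketched in plain Python: infer what the link is carrying, then trade latency against power accordingly. The traffic labels, thresholds, and parameter values below are invented for illustration; the real system presumably uses a trained on-chip model rather than hand-written rules.

```python
# Illustrative sketch of traffic-aware Wi-Fi tuning as described for the
# FastConnect 7900. Labels, thresholds, and settings are invented.

def classify_traffic(avg_packet_bytes, packets_per_sec):
    """Crude rule-based stand-in for the on-chip AI classifier."""
    if packets_per_sec > 50 and avg_packet_bytes < 300:
        return "video_call"      # many small, frequent packets
    if avg_packet_bytes > 1000:
        return "video_stream"    # large buffered transfers
    return "background"

def tune_link(traffic):
    """Map the inferred traffic type to hypothetical latency/power settings."""
    profiles = {
        "video_call":   {"target_latency_ms": 10,  "power_save": False},
        "video_stream": {"target_latency_ms": 100, "power_save": True},
        "background":   {"target_latency_ms": 200, "power_save": True},
    }
    return profiles[traffic]

print(tune_link(classify_traffic(200, 120)))
# prints: {'target_latency_ms': 10, 'power_save': False}
```

A call with many small packets gets low-latency settings at the cost of power, while bulk streaming can buffer and let the radio sleep, which is where the claimed energy savings would come from.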

FastConnect 7900 supports Wi-Fi 7, Wi-Fi 6E and Wi-Fi 6 for peak speeds of up to 5.8 Gbps, and also integrates Bluetooth and ultra wideband (UWB) in a single chip. It is also claimed to use less power than previous generations. ®
