US Nvidia AI Chip Export Ban, AI Data Poisoning Tool, and More: 5 AI Developments You May Have Missed
Today, October 25, was a significant day in the realm of artificial intelligence, particularly where AI chips are concerned. First, the US government has instructed Nvidia to halt exports of certain advanced AI chips to China immediately; the directive was originally scheduled to take effect 30 days after October 17. Additionally, Qualcomm has introduced a new AI-focused chip for Microsoft Windows laptops, claiming its capabilities may even surpass those of Apple’s Mac computers in some tasks. These developments and more are covered in today’s AI roundup, so let’s delve into the details.
The US prevents Nvidia from exporting artificial intelligence chips to China
According to a report by The Guardian, Nvidia has revealed that, due to regulatory changes, the US government has instructed it to immediately stop exporting certain high-end AI chips to China. The restrictions, originally set to take effect 30 days after the Biden administration’s October 17 announcement, are part of measures aimed at preventing countries such as China, Iran and Russia from acquiring advanced artificial intelligence chips developed by Nvidia and other companies. Nvidia didn’t give a specific reason for the accelerated timeline, but said it doesn’t expect the move to have a significant near-term impact on its earnings.
Qualcomm unveils AI chip for Windows PCs
Qualcomm revealed details of a chip designed for Microsoft Windows-based laptops, according to a Reuters report. The chip is due to arrive in laptops in 2024, and the company claims it will surpass Apple’s Mac chips in certain tasks.
According to Qualcomm executives, the upcoming Snapdragon X Elite chip has been designed to excel at AI-related tasks such as email summarization, text generation, and image creation.
These AI capabilities are not limited to laptops. Qualcomm plans to include them in its smartphone chips as well. Google and Meta have both announced plans to harness these features on their respective smartphone platforms.
Technology companies team up on AI safety standards
According to a Financial Times report, Microsoft, OpenAI, Google and Anthropic have worked together to create AI safety standards and have now appointed a leader for their alliance, which aims to fill what they say is a “gap” in global AI regulation.
The four tech giants, which joined forces earlier this summer to create the Frontier Model Forum, have tapped Chris Meserole of the Brookings Institution as the group’s executive director. In addition, the forum has revealed a plan to allocate $10 million to an artificial intelligence safety fund.
The IWF issues a warning about child abuse images created by artificial intelligence
The Internet Watch Foundation (IWF), which works to remove images of child sexual abuse from websites, has identified thousands of AI-generated images that are so realistic they break UK law, reports the BBC.
“The worst nightmares have come true,” said Susie Hargreaves OBE, chief executive of the Cambridge-based IWF. “What is disturbing is that criminals are deliberately training their AI with images of real victims. Children who have been raped in the past are now being included in new scenarios because someone, somewhere wants to see it,” she added.
Data poisoning tool that can corrupt image-generating AI models surfaces
The newly developed Nightshade tool lets artists embed a hidden “poison” in their digital artwork, effectively manipulating the training data of any AI model that scrapes the art, as reported by The Verge. Over time, it can disrupt and degrade the performance of AI art platforms such as DALL-E, Stable Diffusion, and Midjourney, to the point where they can no longer generate coherent images.
Nightshade makes subtle changes to the pixels of digital art that are invisible to the human eye. When the manipulated artwork is used in model training, the poison exploits a vulnerability in how the model learns, confusing it so that, for example, it may no longer recognize a picture of a house as a house and instead interpret it as a boat.
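The Verge report does not detail Nightshade’s actual algorithm, but the basic mechanism it relies on, a pixel-level change too small for a human viewer to notice, can be illustrated with a minimal sketch. Everything below is hypothetical: the function name, file paths, and `epsilon` budget are invented for illustration, and bounded random noise stands in for the carefully crafted perturbation a real poisoning tool would compute.

```python
# Conceptual sketch of a pixel-level perturbation, NOT Nightshade's algorithm.
# Random noise stands in for a crafted poison pattern; a real attack would
# optimize the perturbation to teach the model a wrong association.
import numpy as np
from PIL import Image

def add_imperceptible_perturbation(path_in: str, path_out: str,
                                   epsilon: float = 2.0) -> None:
    """Shift each pixel by at most `epsilon` intensity levels (out of 255),
    a change far too small to be visible to a human viewer."""
    img = np.asarray(Image.open(path_in).convert("RGB"), dtype=np.float32)

    # Bounded noise as a placeholder for a crafted poison pattern.
    rng = np.random.default_rng(seed=42)
    perturbation = rng.uniform(-epsilon, epsilon, size=img.shape)

    # Keep pixel values in the valid 0-255 range and save the result.
    poisoned = np.clip(img + perturbation, 0, 255).astype(np.uint8)
    Image.fromarray(poisoned).save(path_out)

# Hypothetical usage:
# add_imperceptible_perturbation("artwork.png", "artwork_poisoned.png")
```

The sketch only demonstrates that such a perturbation survives in the saved file while remaining imperceptible; in a tool like Nightshade, the perturbation is optimized so that models trained on many poisoned images learn a wrong association (the house-to-boat confusion described above), which random noise alone would not achieve.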