
AI consumes a lot of energy. Hackers could make it consume more.


The attack: If you change the input to this type of neural network, such as the image it is fed, you can change how much computation it takes to process it. This opens up a vulnerability that hackers could exploit, as researchers at the Maryland Cybersecurity Center showed at the International Conference on Learning Representations this week. By adding small amounts of noise to the network's inputs, they made the inputs harder to process and drove up the computation required.
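The mechanism can be illustrated with a minimal sketch. Everything here is invented for illustration, not taken from the paper: a toy input-adaptive "model" that averages evidence chunk by chunk and exits as soon as it is confident, so easy inputs cost few stages while an input perturbed to look ambiguous pays for every stage.

```python
import numpy as np

def forward(x, n_stages=8, threshold=0.5):
    """Toy early-exit inference: return (predicted sign, stages executed)."""
    total, count = 0.0, 0
    for i, chunk in enumerate(np.array_split(x, n_stages), start=1):
        total += chunk.sum()
        count += len(chunk)
        mean = total / count
        if abs(mean) >= threshold:          # confident enough: exit early
            return int(np.sign(mean)), i
    return int(np.sign(mean)), n_stages     # ambiguous: paid for every stage

clean = np.full(8, 1.0)                     # unambiguous input
# Hand-crafted perturbation that keeps the running evidence ambiguous,
# dragging the model through all stages without flipping its answer.
noisy = np.array([0.1] * 7 + [3.0])

print(forward(clean))   # exits at the first stage
print(forward(noisy))   # same prediction, 8x the computation
```

The clean input triggers the first exit immediately; the perturbed one is classified the same way but only after the model has run every stage, which is exactly the cost inflation the attack exploits.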

When the attackers were assumed to have complete information about the neural network, they could max out its energy draw. When the attackers were assumed to have little to no information, they were still able to slow the network's processing and increase energy use by between 20% and 80%. The reason, the researchers found, is that the attacks transfer well across different types of neural networks: designing an attack against one image-classification system is enough to disrupt many others, says PhD student and paper co-author Yiğitcan Kaya.

Note: This type of attack is still somewhat theoretical. Input-adaptive architectures are not yet commonly used in real-world applications. But the researchers believe this will quickly change as the industry comes under pressure to deploy lighter-weight neural networks, such as for smart-home and IoT devices. Professor Tudor Dumitraş, who advised the research, says more work is needed to understand the extent to which this kind of threat could cause damage. But, he adds, the paper is a first step toward raising awareness: "What's important to me is to bring to people's attention the fact that this is a new threat model, and these kinds of attacks can be done."
