
Europe is pushing for a ban on biometric surveillance


Your body is a data gold mine. From the way you look to the way you think and feel, companies working in the growing biometrics industry are developing new ways to keep track of everything we do. And in many cases, you may not even know that you are being tracked.

But the biometrics business is on a collision course with Europe’s leading data protection experts. Both the European Data Protection Supervisor, which acts as the EU’s independent data protection authority, and the European Data Protection Board, which helps countries apply the GDPR consistently, have called for a complete ban on the use of AI to automatically recognize people.

“Deploying remote biometric identification in publicly accessible spaces means the end of anonymity in those places,” the heads of the two bodies, Andrea Jelinek and Wojciech Wiewiórowski, wrote in a joint statement at the end of June. They argue that AI should not be used in public spaces for facial recognition, gait recognition, fingerprints, DNA, voice, keystrokes, or other types of biometrics. There should also be a ban on trying to predict people’s ethnicity, gender, and political or sexual orientation with AI.

But such calls fly in the face of the EU’s proposed AI regulations. The rules, which were presented in April, say that “remote biometric identification” is high-risk, meaning it is allowed but subject to stricter controls than other uses of AI. Politicians across the EU have been debating the AI rules for years, and biometric surveillance has already become one of the most contentious issues. Once approved, the regulations will shape how hundreds of millions of people are surveilled in the coming decades. And the debate starts now.

Facial recognition has been controversial for years, but the real biometrics boom is targeting other parts of your body. Across the EU’s 27 member states, a number of companies are developing and deploying biometric technologies that, in some cases, aim to predict people’s gender and ethnicity and recognize their emotions. In many cases the technology is already being used in the real world. However, using AI to make these classifications can be scientifically and ethically questionable. Such technologies risk invading people’s privacy or automatically discriminating against people.

Take Herta Security and VisionLabs, for example. Both companies develop facial recognition technology for a variety of uses and say it can be deployed by law enforcement as well as the retail and transport industries. Documents from Herta Security, which is based in Barcelona, claim its clients include police forces in Germany, Spain, Uruguay, and Colombia, as well as airports, casinos, sports stadiums, shopping malls, and hotel chains such as Marriott and Holiday Inn.

Critics point out that both Herta Security and VisionLabs say parts of their systems can be used to track sensitive attributes. “Many systems, even those that are being used to identify people, rely on these very harmful classifications and categorizations as the underlying logic,” says Ella Jakubowska, a policy consultant who studies biometrics at the advocacy group European Digital Rights. The group is campaigning for a ban on biometric surveillance across Europe.

BioMarketing, Herta Security’s face analysis tool, is billed as a way for retailers and advertisers to learn about their customers, and can “extract” everything from a person’s age and gender to whether they wear glasses, as well as track facial expressions. Herta Security says the technology is “perfect” for developing targeted advertising or helping companies understand who their customers are. The tool, Herta Security says, can also classify people by “ethnicity.” Under the GDPR, personal data that reveals “racial or ethnic origin” is considered sensitive, with strict controls on how it can be used.

Jakubowska says she challenged the CEO of Herta Security last year over its use of ethnicity, and the company has since removed the claim from its marketing material. It is unclear whether the function has been removed from the tool itself. Company documents hosted by a third party still list ethnicity as one of the traits that can be detected using BioMarketing. Company documents from 2013 referred to the tool detecting “race” before they were updated to say ethnicity. Herta Security, which has received more than €500,000 in EU funding and has been awarded the EU Seal of Excellence, did not respond to requests for comment.
