
Collective data rights can stop big technology companies from eroding privacy

Individuals should not have to fight for their data privacy rights or be held responsible for every consequence of their digital actions. Consider an analogy: people have the right to safe drinking water, but they are not expected to exercise that right by checking the water quality with a pipette every time they drink from the tap. Instead, regulatory agencies act on everyone's behalf to ensure that all our water is safe. The same should be done for digital privacy: it is not something the average user is equipped to protect, nor something they should be expected to.

Two parallel approaches are needed to protect the public.

One is to make better use of class or group actions, otherwise known as collective redress actions. Historically these have been limited in Europe, but in November 2020 the European Parliament passed a measure that calls on all 27 EU member states to implement measures enabling collective redress actions across the region. Compared with the US, the EU has stronger laws protecting consumer data and promoting competition, so in Europe class or group actions can be a powerful tool for lawyers and advocates to force big technology companies to change their behavior, even in cases where the harm to any one person would be very small.

In the US, class action lawsuits have mostly been used to seek financial damages, but they can also be used to force changes in policy and practice. They can work hand in hand with campaigns to shift public opinion, especially in consumer cases (for example, forcing Big Tobacco to admit the link between smoking and cancer, or paving the way for car seatbelt laws). They are powerful tools when thousands, if not millions, of people suffer similar harms, which together help prove causation. Part of the problem is getting the right information to sue in the first place. Government efforts, like the lawsuit filed against Facebook in December by the Federal Trade Commission (FTC) and a group of 46 states, are crucial. As tech journalist Gilad Edelman puts it, "according to the lawsuits, degrading user privacy over time is a form of consumer harm (a social network that protects users' data less is an inferior product) that tips Facebook from a mere monopoly into an illegal one." In the US, as the New York Times recently reported, private lawsuits, including class actions, are often "based on evidence from government investigations." In the EU, however, it is the other way around: private litigation can open up the possibility of regulatory action, which is otherwise constrained by the gap between EU-wide laws and national regulators.

This brings us to the second approach: a little-known 2016 French law called the Digital Republic Bill. The Digital Republic Bill is one of the few modern laws focused on automated decision-making. The law currently applies only to administrative decisions taken by algorithmic systems in the public sector. But it offers a glimpse of what future laws could look like. It says that the source code behind these systems must be made available to the public, and that anyone can request that code.

Crucially, the law allows advocacy organizations to request information about the functioning of an algorithm and the source code behind it even if they do not represent a specific individual or claimant who has allegedly been harmed. The need to find a "perfect plaintiff" who can prove harm in order to file a suit makes it very difficult to tackle the systemic problems that cause collective data harms. Laure Lucchesi, director of Etalab, the French government office that oversees the bill, says the law's focus on algorithmic accountability was ahead of its time. Other laws, such as the European General Data Protection Regulation (GDPR), focus too heavily on individual consent and privacy. But both the data and the algorithms need to be regulated.


Apple promises in an ad: "Right now, there is more private information on your phone than in your home. Your locations, your messages, your heart rate after a run. These are private things. And they should belong to you." The point is that the companies do not really care that much about your personal data in itself, which is why they can afford to present it as something locked safely in a box. The value lies in the inferences drawn from your interactions, which are also stored on your phone, but which are not treated as your data.

Another example is Google's acquisition of Fitbit. Google has promised not to use Fitbit data for advertising, but the advertising leverage Google gains does not depend on individual data. As a group of European economists put it in a recent paper published by the Centre for Economic Policy Research, a London-based think tank: "It is enough for Google to correlate aggregate health outcomes with non-health outcomes, even for a subset of Fitbit users who did not opt out of the use of their data, to predict health outcomes (and thus ad-targeting possibilities) for all non-Fitbit users (billions of them)." The Google-Fitbit deal is essentially a group data deal. Google positions itself in a key health data market while gaining the ability to triangulate different data sets and make money from the inferences used by the health and insurance markets.
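To make that group-inference point concrete, here is a minimal, hypothetical sketch (with invented data and field names, not anything from the Google-Fitbit deal itself) of how a model trained only on users who opted in to sharing fitness data can still be used to infer health-related outcomes for the much larger population that never shared anything, as long as the two groups share correlated, non-health attributes.

```python
# Hypothetical sketch of the "group data" problem described above.
# All attributes, sizes, and correlations here are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Non-health attributes known for everyone (e.g. from ad profiles):
# age bracket, neighborhood income index, hours of screen time.
n_users, n_others = 5_000, 50_000
X_users = rng.normal(size=(n_users, 3))    # people who shared wearable data
X_others = rng.normal(size=(n_others, 3))  # people who never opted in

# Health outcome observed only for the opted-in subset; by construction
# it correlates with the non-health attributes.
y_users = (X_users @ np.array([0.8, -0.5, 0.3])
           + rng.normal(scale=0.5, size=n_users)) > 0

# Train on the opted-in subset...
model = LogisticRegression().fit(X_users, y_users)

# ...and infer health risk for people who never shared any health data.
inferred_risk = model.predict_proba(X_others)[:, 1]
print(f"Predicted high-risk share among non-users: {inferred_risk.mean():.2f}")
```

The individual consent of the opted-in users is beside the point here: the inference falls on everyone else, which is exactly the kind of collective harm individual privacy rights do not capture.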

What policymakers need to do

Some early bills have sought to fill this gap in the United States. In 2019, Senators Cory Booker and Ron Wyden introduced the Algorithmic Accountability Act, which subsequently stalled in Congress. The act would have required certain companies to conduct algorithmic impact assessments to check for bias or discrimination. In the US, this crucial issue is more likely to be addressed first through laws that apply to specific sectors such as health care, where the risk of algorithmic bias has been magnified by the pandemic's disparate impact on different groups of the US population.

At the end of January, the Public Health Emergency Privacy Act was reintroduced in the Senate and the House of Representatives by Senators Mark Warner and Richard Blumenthal. This law would ensure that data collected for public health purposes is not used for any other purpose. It would prohibit the use of health data for discriminatory, unrelated, or intrusive purposes, including commercial advertising, e-commerce, or efforts to gate access to employment, finance, insurance, housing, or education. That would be a great start. Going further, a law that applies to all algorithmic decision-making should, inspired by the French example, establish strong accountability, robust regulatory oversight of data-driven decision-making, and the ability to audit and inspect algorithmic decisions and their impact on society.

Three elements are needed to ensure strong accountability: (1) clear transparency about where and when automated decisions are made and how they affect individuals and groups, (2) the right of citizens to give meaningful input and to call on those in power to justify their decisions, and (3) the ability to enforce sanctions. Crucially, policymakers will have to decide, as has recently been suggested in the EU, what constitutes a "high risk" algorithm that should meet a higher standard of scrutiny.


Clear transparency

The focus should be on public scrutiny of automated decision-making and on the kinds of transparency that lead to accountability. That means disclosing the existence of algorithms, their purpose, and the training data behind them, as well as their impacts: whether they have produced disparate outcomes, and for which groups.
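As a rough illustration of the kind of scrutiny such transparency would enable, the sketch below checks whether an automated decision system approves different groups at different rates. The decision log, field names, and the four-fifths threshold are illustrative assumptions, not anything specified in the bills discussed here.

```python
# Illustrative disparity check over a log of automated decisions.
# The "four-fifths rule" threshold used below is one common heuristic,
# assumed here purely for demonstration.
from collections import defaultdict

decisions = [
    {"group": "A", "approved": True},
    {"group": "A", "approved": True},
    {"group": "A", "approved": False},
    {"group": "B", "approved": True},
    {"group": "B", "approved": False},
    {"group": "B", "approved": False},
]

totals, approvals = defaultdict(int), defaultdict(int)
for d in decisions:
    totals[d["group"]] += 1
    approvals[d["group"]] += d["approved"]

rates = {g: approvals[g] / totals[g] for g in totals}
best = max(rates.values())
for group, rate in rates.items():
    ratio = rate / best
    flag = "FLAG" if ratio < 0.8 else "ok"  # four-fifths rule heuristic
    print(f"group {group}: approval rate {rate:.2f}, ratio {ratio:.2f} ({flag})")
```

Auditors and regulators could only run checks like this if the existence of the system, its purpose, and its outcome data were disclosed in the first place, which is the point of the transparency requirement.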

Citizen participation

Citizens have a fundamental right to call on those in power to justify their decisions. This "right to demand answers" should not be limited to consultative participation, in which people are asked for input and officials simply move on. It should include empowered participation, where public input is mandated before the deployment of high-risk algorithms in both the public and private sectors.

Penalties

Ultimately, the power to sanction is essential if these reforms are to succeed and accountability is to be achieved. Audit requirements for how data is processed, verified, and corrected should be mandatory, auditors should be given that baseline access, and oversight bodies should be empowered to enforce sanctions, preventing harm rather than merely remedying it after the fact.


The problem of collective data-driven harms affects everyone. The Public Health Emergency Privacy Act is a first step. Congress should then draw on the lessons of implementing that act to develop laws that focus on collective data rights. Only through such action can the US avoid a future in which inferences drawn by data companies determine people's access to housing, jobs, credit, and other opportunities for years to come.
