Amid growing concern about data privacy, the artificial intelligence firm OpenAI finds itself in the middle of a legal storm. The company has been accused by an anonymous group of individuals of stealing significant amounts of private information in its efforts to improve its artificial intelligence models.
According to a Bloomberg report, the lawsuit alleges that OpenAI violated privacy laws by covertly extracting 300 billion words from the web, including personal information obtained without consent.
The plot thickens
The plaintiffs, who have preferred to remain anonymous for fear of retaliation, are represented by the Clarkson Law Firm. In the 157-page lawsuit, they cite $3bn in potential damages.
The accusations against OpenAI are serious. The lawsuit argues that the company is risking a possible “civilizational collapse” and operates a vast clandestine web-scraping operation, which allegedly violates terms-of-service agreements as well as state and federal privacy and property laws.
Affected parties and accomplices
One of the main defendants in the lawsuit is Microsoft Corp., which is reportedly planning to invest billions in OpenAI. The suit notes that OpenAI’s artificial intelligence models, including ChatGPT, were trained using the private information of users, including children, without their permission.
Warning voices in the technological world
Amid growing anxiety about how fast AI technology is advancing, notable personalities have raised their voices in alarm. Among them, Twitter CEO Elon Musk, Apple co-founder Steve Wozniak and politician Andrew Yang urged AI labs to “take an immediate pause” on their work. Their fear is a “race out of control” to develop the technology.
Who should OpenAI pay the fine to?
If OpenAI is found guilty and ordered to pay a fine or damages, these would generally be paid to the plaintiffs in the case. In this scenario, the plaintiffs are the anonymous individuals represented by the Clarkson Law Firm.
However, depending on the nature of the lawsuit and the specific laws violated, a portion of the fines could also go to government agencies. For example, in cases involving violations of federal privacy or property laws in the United States, a portion of the fines assessed may be paid to the federal government.
Furthermore, in cases of this magnitude, there are often multiple plaintiffs and multiple injured parties. Damages may be distributed among various groups of claimants, or a fund may be established to compensate victims of alleged privacy violations.
Of course, all these details would be determined by the court and would depend on the specific circumstances and laws involved in the case.
But if what is being reported is that information has been stolen from millions of people around the world, shouldn’t those people get the money?
In theory, yes: the people whose data was allegedly stolen would be the ones harmed, and they could therefore be deemed eligible for compensation if OpenAI is found guilty. However, there are several practical complications.
First, in most cases of mass data breaches, it is very difficult to identify and contact each affected individual to directly compensate them. This is especially true if the affected individuals are scattered around the world and privacy and compensation laws vary from country to country.
Second, it can often be challenging to prove the exact individual harm that results from such data breaches. While some individuals may have suffered concrete and measurable harm, others may not have experienced any visible or direct harm.
In many cases, the solution to these challenges is to establish a settlement fund, from which affected people can then apply for a share. However, the application, verification and payment process can be long and complicated.
On the other hand, in some cases, compensation may take the form of services rather than direct payments. For example, a company accused of a data breach could be required to provide free credit monitoring services to affected individuals.
Finally, it is important to remember that the details of any compensation would ultimately depend on the decisions of the court and the specific laws that apply in this case. Current laws may not be fully equipped to handle the unique issues that arise with AI and data privacy, and this case could help illustrate the need for new regulations and approaches.