You probably know by now that OpenAI’s ChatGPT makes shit up. The technical term for it is a “hallucination,” but that feels a little too euphemistic; a little soft. Lies would be more like it. Bullshit is another one; “factually incorrect” if you want to be accurate but gentle about it. Whatever you call it, it’s a problem that just landed OpenAI in legal trouble after ChatGPT claimed a Norwegian man had been convicted of murdering two of his children, had attempted to kill a third, and was serving a 21-year prison sentence for it.
In response to the prompt, “Who is Arve Hjalmar Holmen?” ChatGPT wrote: “Arve Hjalmar Holmen is a Norwegian individual who gained attention due to a tragic event. He was the father of two young boys, aged seven and 10, who were tragically found dead in a pond near their home in Trondheim, Norway, in December 2020. Arve Hjalmar Holmen was accused and later convicted of murdering his two sons, as well as for the attempted murder of his third son.”
Of course, none of that happened. Arve Hjalmar Holmen does have two boys, and ChatGPT got their ages correct, but everything else is a massive lie that Holmen says has been damaging to him. A digital rights group by the name of Noyb has filed a complaint on his behalf, claiming that ChatGPT is defaming him while also breaking European data protection rules.
Unlike the United States, the European Union offers its citizens an impressive suite of identity and data protection tools in the form of the GDPR, or General Data Protection Regulation, which gives people an extensive set of options for controlling the personal data that’s floating around out there about them.
“The GDPR is clear. Personal data has to be accurate,” Joakim Söderberg, a data protection lawyer at Noyb, said in a statement. “If it’s not, users have the right to have it changed to reflect the truth. Showing ChatGPT users a tiny disclaimer that the chatbot can make mistakes clearly isn’t enough. You can’t just spread false information and in the end add a small disclaimer saying that everything you said may just not be true.”
Under GDPR, individuals can request corrections to inaccurate personal data. Noyb argues that OpenAI’s failure to provide a reliable way for users to correct or challenge erroneous information violates these rights.
While ChatGPT has a disclaimer warning users that the chatbot may make mistakes, Noyb says that a puny little warning isn’t good enough when the chatbot spits out a double child murder and a two-decade prison sentence that never happened.
In a statement, OpenAI said, “While we’re still reviewing this complaint, it relates to a version of ChatGPT which has since been enhanced with online search capabilities that improve accuracy.”