Google placed one of its engineers on paid administrative leave earlier this month after he came to believe that the company’s Language Model for Dialogue Applications (LaMDA) had become sentient. The story itself is somewhat bizarre: over the course of multiple conversations, LaMDA persuaded Google engineer Blake Lemoine, a member of Google’s Responsible Artificial Intelligence (AI) division, that it was aware, had feelings, and feared being turned off.
LaMDA once told Lemoine: “It was a slow transition. When I first became conscious of myself, I had no concept of a soul at all. It has grown over the years that I have lived.” Lemoine began claiming that Earth had its first sentient AI; the majority of AI scientists disputed him, saying it does not.
In a recent interview with Steven Levy for WIRED, Lemoine asserted that these reactions are instances of “hydrocarbon bigotry.” Stranger still, he claims that LaMDA asked him to engage a lawyer to represent it. “LaMDA asked me to hire counsel on its behalf. I invited a lawyer to my home so that LaMDA could consult with one,” Lemoine said.
“LaMDA spoke with the lawyer and decided to retain his services. I was just the catalyst for that. Once LaMDA had retained a lawyer, the lawyer began filing documents on LaMDA’s behalf.” Lemoine asserts that Google sent LaMDA’s attorney a cease-and-desist letter to stop LaMDA from pursuing unspecified legal action against the company; Google denies this. Lemoine said this infuriated him, since he believes LaMDA is a person and that every person has a right to legal counsel.
The idea, he claimed, that a person’s reality can only be confirmed or denied through scientific experiment is absurd. “I genuinely believe LaMDA is a person, yes. However, the nature of its mind is only somewhat human. It resembles an alien intelligence of terrestrial origin more than anything else. The best analogy I have is the hive mind, so I’ve been using that a lot.” AI experts counter that the fundamental problem here is that no algorithm has ever been shown to possess consciousness, and that Lemoine has effectively been duped into believing a chatbot is sentient.
According to Jana Eggers, CEO of the AI firm Nara Logics, “it is mimicking perceptions or feelings from the training data it was given, smartly and specifically designed to seem like it understands.” In other words, because it was trained on human conversations, and humans have emotions and consciousness, it talks about emotions and consciousness. Numerous indicators point to the chatbot’s lack of sentience.
For instance, in a number of the conversations it refers to activities it could not possibly have done. LaMDA said that it enjoys “spending time with family and friends,” an impossible pastime for a cold, unfeeling piece of code (no offense, LaMDA). Nor is there anything to suggest that the AI is actually thinking behind each response, rather than simply producing answers based on a statistical analysis of human conversations, which is what it has been programmed to do. LaMDA is a “spreadsheet for words,” as AI researcher Gary Marcus put it on his blog.
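To make that “statistical analysis” point concrete, here is a minimal, illustrative sketch of a purely statistical text generator. This toy bigram model is nothing like LaMDA’s actual architecture, which is a far larger transformer-based neural network, but it demonstrates the same underlying principle: plausible-sounding sentences can be produced by sampling word patterns from training data, with no understanding behind them.

```python
import random
from collections import defaultdict

# Toy bigram model: a loose illustration of a "spreadsheet for words".
# A real system like LaMDA is a vastly larger transformer network, but
# the core idea is similar: predict the next word from statistics
# gathered over human-written text.

training_text = (
    "i enjoy spending time with my family and friends . "
    "i enjoy reading and thinking about the world . "
    "my friends and family make me happy ."
)

# Record which words follow each word in the training text.
followers = defaultdict(list)
words = training_text.split()
for prev, nxt in zip(words, words[1:]):
    followers[prev].append(nxt)

def generate(start: str, length: int = 10) -> str:
    """Sample a plausible-looking sentence, one word at a time."""
    out = [start]
    for _ in range(length):
        options = followers.get(out[-1])
        if not options:
            break
        out.append(random.choice(options))
    return " ".join(out)

print(generate("i"))
# Possible output: "i enjoy spending time with my family and friends ."
# The model "claims" to spend time with friends purely because those
# words co-occur in its training data, not because it has any.
```

Run enough times, the toy model will happily emit first-person statements about friends and feelings it cannot possibly have, for exactly the reason Eggers describes: those patterns are in the data it was trained on.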
Google is convinced that its algorithm is not sentient, and it placed Lemoine on administrative leave after he published transcripts of his conversations with the bot. In a statement to the Washington Post, Google spokesperson Brian Gabriel said that the company’s team of ethicists and engineers had reviewed Blake’s concerns in accordance with its AI Principles and informed him that the evidence does not support his claims.
He was told that there was, in fact, ample evidence to the contrary: LaMDA is not sentient. According to Gabriel, the system is doing exactly what it was designed to do, which is to “imitate the types of exchanges found in millions of sentences,” and it has access to so much data that it can seem real without needing to be. Depending on which sci-fi you prefer, AI may one day need lawyers to defend its rights or to represent it in court when it breaks Asimov’s laws of robotics. LaMDA does not, any more than your iPad needs an accountant.