Future of Humanity Institute Academic Warns Against Planned Message to Aliens

A newly proposed message to aliens could be a bad idea for a species that wants to survive, according to an Oxford scholar at the Future of Humanity Institute. Earlier this month, a project led by Jonathan Jiang of NASA’s Jet Propulsion Laboratory at the California Institute of Technology drafted a novel message to transmit into space in a bid to contact other civilizations.

“The proposed message includes basic mathematical and physical concepts to establish a universal means of communication,” according to the arXiv paper, “followed by information on the biochemical composition of life on Earth, the Solar System’s time-stamped position in the Milky Way relative to known globular clusters, as well as digitized depictions of the Solar System and Earth’s surface.”

“At the end of the communication, there are computerized representations of the human form, as well as an invitation for any receiving intelligences to react.” But according to Dr. Anders Sandberg of Oxford University’s Future of Humanity Institute, and to anybody who has read about the darker answers to the Fermi Paradox, the so-called “Beacon in the Galaxy” may not be such a good idea for humanity.

“The basic premise is that aliens may either contact us with good news or be helpful, or they could be deadly. You might say that the risk equation weighs more heavily than the good news if anything threatens the species,” Sandberg said on Radio 4’s Today Programme. “Of course, many people would argue, well, we’re a primitive young species, and we’ve seen in Earth’s history that when a powerful civilisation meets a more rudimentary society, things typically don’t go well for the primitive society.”

“That’s oversimplifying things, but there are reasons to be worried that we shouldn’t just send out signals at random.” According to the Dark Forest theory, even if most species are harmless, it only takes one advanced species making a coldly logical decision to cause catastrophe on a cosmic scale. In Liu Cixin’s sci-fi novel The Dark Forest, the Fermi Paradox is discussed in conversations between a sociology professor and a former astronomer, the mother of his deceased friend.

According to the professor, life will try to survive, and there is no way of knowing the objectives of other extraterrestrial species. Some may be helpful, while others may be hostile. Even if the life out there isn’t hostile, it will continue to expand in a universe with finite resources, increasing the chances of conflict with others who also want those resources.

Given these circumstances, the book concludes that the safest course of action for intelligent life is to wipe out any other lifeforms before they can do the same to it. Of course, the team behind Beacon in the Galaxy believes that any alien intelligence that discovers the message will not be hostile. In their study, they write, “Logic predicts that a creature that has evolved sufficient complexity to accomplish communication via the universe would also have gained high degrees of cooperation amongst itself. As a result, they will understand the value of peace and cooperation.”