
When the world’s first nuclear weapon detonated on July 16, 1945, its “father,” Manhattan Project scientific director Robert Oppenheimer, made his view of nuclear weapons blindingly clear, later recalling the line: “Now I am become Death, the destroyer of worlds.”

For better or worse, not all minds considered great in our time share Oppenheimer’s distaste for nuclear weapons. Elon Musk, for example, has expressed a desire to launch 10,000 nuclear missiles into space to terraform Mars, while others like Bill Gates think we should worry about another looming threat.

Elon Musk’s thoughts on nuclear weapons

Before he became the richest person in the world, Elon Musk had a moment in 2020 when he agreed with an assessment of his plan to terraform Mars suggesting he would need more than 10,000 maximum-payload nuclear missiles to finish the job.

This came in reply to a Twitter user who had posted a Russian News Agency TASS article in which a Russian space official cast doubt on Musk’s plans.

Elon Musk has ‘no problem’ dropping 10,000 nukes on Mars

“For example, for a thermonuclear explosion on Mars’ pole, one of the plans of SpaceX, to have tangible results, more than 10,000 launches of missiles that can carry the largest payloads and are being developed now are needed,” said the agency official during the interview.

The executive director of Russia’s space agency, Roscosmos, said that “humanity does not have the capacity to influence in any tangible way Mars’ or Venus’ climate,” according to the TASS report.

Musk’s tweeted reply: “No problem.”

Bill Gates thinks nuclear weapons less threatening than AI

The Roscosmos director later said Musk’s idea was “inhumane” and could “destroy” the planet, according to The Moscow Times. And technically speaking, he’s not wrong: if 10,000 nuclear weapons were put into orbit, the potential damage to people on Earth would be unspeakably high.

During a Stanford lecture, Bill Gates spoke to the uniqueness of nuclear technology. “The world hasn’t had that many technologies that are both promising and dangerous,” Gates said, according to a Stanford Daily report. “We had nuclear weapons and nuclear energy, and so far so good.”

However, Gates believes the dangers of AI eclipse those of nuclear weapons. He may not be wrong, either: numerous experts have warned that we might design an AI system that displays unintended behaviors, ones that could threaten species-wide extinction, mirroring the worries of 20th-century experts during the Cold War over mutually assured destruction from nuclear war.

Looking to our time’s greatest minds gives context to big issues

Just as nuclear power has provided benefits to society, AI is already emerging as perhaps the central source of innovation in the modern world. “If you give kids in some countries an antibiotic once a year that costs two cents called azithromycin, it saves a hundred thousand lives,” said Gates.

Before the pandemic, a Stanford study found that 60% of U.S. citizens would approve of “killing two million civilians to prevent an invasion […] that might kill 20,000 U.S. soldiers.”

A harrowing figure, to be sure. But it serves as a reminder that when we lack the resources to grasp big subjects on our own, looking to those considered the greatest minds of our time can help us find our bearings and come to understand the ever-shifting complexities of a rapidly advancing world.
