Stephen Hawking’s final words to humanity

World-renowned theoretical physicist Stephen Hawking passed away Tuesday, leaving behind a legacy of groundbreaking work on black holes, space, time, and the nature of the universe.

In recent years, Hawking, who lived with amyotrophic lateral sclerosis (ALS, also known as Lou Gehrig’s disease), a neurodegenerative disorder, spoke out on a range of societal, environmental, and existential dilemmas facing humanity.

In 2016, he speculated that alien life exists but warned humanity to be cautious about pursuing relations with it, comparing extraterrestrials’ intentions to some of the worst exploitations humanity has inflicted on itself.

On multiple occasions, he echoed the sentiment that when the Native Americans first encountered Christopher Columbus, it “didn’t turn out so well.”

“If aliens visit us, the outcome would be much as when Columbus landed in America, which didn’t turn out well for the Native Americans,” he said as far back as 2010, adding that aliens “might simply raid Earth for resources, then move on.”

Hawking was also deeply skeptical of artificial intelligence.

“Success in creating effective AI could be the biggest event in the history of our civilization. Or the worst. We just don’t know. So we cannot know if we will be infinitely helped by AI, or ignored by it and side-lined, or conceivably destroyed by it,” he said last year.

“Unless we learn how to prepare for, and avoid, the potential risks, AI could be the worst event in the history of our civilization. It brings dangers, like powerful autonomous weapons, or new ways for the few to oppress the many. It could bring great disruption to our economy.”

In 2015, he asserted that “The real risk with AI isn’t malice but competence. A super intelligent AI will be extremely good at accomplishing its goals, and if those goals aren’t aligned with ours, we’re in trouble.”

Hawking’s fear of technology in general led him to advocate a controversial solution. He suggested “some form of world government” might be needed to save humanity but also acknowledged “that might become a tyranny.”

He was more firm on other proposed solutions, particularly that humans need to leave Earth and colonize Mars and the moon or face extinction, in part due to the destruction of the environment and the likelihood that an asteroid will eventually hit the planet.

“This is not science fiction; it is guaranteed by the laws of physics and probability,” he said last year of the threat of an asteroid strike. “To stay risks being annihilated.”

He also said:

“I am convinced that humans need to leave Earth. The Earth is becoming too small for us; our physical resources are being drained at an alarming rate.

“We have given our planet the disastrous gift of climate change: rising temperatures, the reduction of the polar ice caps, deforestation, and the decimation of animal species.”

He concluded:

“The human race has existed as a separate species for about two million years. Civilization began about 10,000 years ago, and the rate of development has been steadily increasing. If humanity is to continue for another million years, it relies on boldly going where no one has gone before. I hope for the best. I have to. We have no other option.”

Hawking also warned of the risks of nuclear war and genetically engineered viruses.

Despite his stark warnings, Hawking remained positive. As he said after warning of the risks of AI:

“All this may sound a bit doom-laden, but I am an optimist. I think the human race will rise to meet these challenges.”

Hawking was 76 and lived decades longer than expected after his diagnosis with Lou Gehrig’s disease.

Via Anti-Media


Featured Image: Lwp Kommunikáció/Flickr

The post Stephen Hawking’s final words to humanity appeared first on Intellihub.
