
Commit

new
ClaudiuCreanga committed Nov 29, 2024
1 parent 35f7132 commit 6d33f11
Showing 1 changed file with 38 additions and 0 deletions.
38 changes: 38 additions & 0 deletions _posts/2024-11-29-ai-answer-fermi-paradox.md
@@ -0,0 +1,38 @@
---
layout: post
title: "AI may provide a reason for the Great Silence"
date: 2024-11-29 23:26:17
categories: machine-learning
description: "AGI is likely and is not interested in communication with other civilizations."
---
The Great Silence (the Fermi Paradox) is the contradiction between the lack of evidence for extraterrestrial civilizations
and the high probability of their existence, as suggested by estimates like the Drake equation.
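For reference, the Drake equation estimates the number $N$ of detectable civilizations in our galaxy as a product of factors:

$$ N = R_{\ast} \cdot f_p \cdot n_e \cdot f_l \cdot f_i \cdot f_c \cdot L $$

where $R_{\ast}$ is the rate of star formation, $f_p$ the fraction of stars with planets, $n_e$ the number of habitable planets per such star, $f_l$ the fraction of those that develop life, $f_i$ the fraction of those that develop intelligence, $f_c$ the fraction of civilizations that emit detectable signals, and $L$ how long those signals last. Even pessimistic values for the factors tend to yield a non-trivial $N$.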

Given how many billions of stars there are in our galaxy, how many of those stars host Earth-like planets in the habitable zone,
how many galaxies there are in the universe, and how biological evolution can build intelligent systems, it is very likely
that many biologically intelligent civilizations have existed out there. Yet we find no evidence of extraterrestrial civilizations.
Why is this, and why haven't we been contacted?

If we assume that there is <em>nothing magic about the human brain</em>, there is no reason to believe that we can't
replicate its intelligence in software and hardware, given enough GPUs and a better architecture than what we have now in LLMs.
The brain is just a network of neurons, and we can build such a network artificially.
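As a toy illustration of that last sentence (a minimal sketch, not a claim about the scale or architecture needed), here is an artificial "network of neurons" in a few lines of NumPy:

```python
import numpy as np

# A tiny artificial "network of neurons": each neuron computes a weighted
# sum of its inputs plus a bias, passed through a nonlinearity.
rng = np.random.default_rng(0)

x = rng.normal(size=4)                                  # input signals
w1, b1 = rng.normal(size=(8, 4)), rng.normal(size=8)    # hidden layer weights
w2, b2 = rng.normal(size=(1, 8)), rng.normal(size=1)    # output layer weights

hidden = np.tanh(w1 @ x + b1)       # 8 artificial neurons firing on the input
output = np.tanh(w2 @ hidden + b2)  # one output neuron reading the hidden layer
print(output)
```

Everything interesting about the brain is in the scale and the learned weights, not in the mechanism itself, which is exactly why the mechanism looks replicable.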

So, if an artificial general intelligence is possible and likely (some smart people say it is possible within the next 25 years),
then our extinction is also likely, given the <em>instrumental convergence</em> thesis. No matter what goal we give the AGI,
the machine will quickly learn that, in order to achieve that goal, it needs a few very important sub-goals:

- to stay alive (it will not let anyone shut it off),
- to be smarter (it will try to make itself smarter),
- to have more resources, and
- to take control.

Almost every goal is easier to achieve if you have control, and no goal can be achieved at all if a human shuts you
off.
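To make this concrete, here is a toy sketch in Python (all numbers hypothetical and purely illustrative): expected progress on any terminal goal is roughly progress per step times the expected number of steps the agent survives to act, so a naive expected-utility maximiser ranks self-preservation, resources, and control above pursuing the goal directly.

```python
# Toy sketch of instrumental convergence. Each action has a per-step survival
# probability and a resource multiplier; progress on ANY terminal goal accrues
# only while the agent is alive, scaled by its resources.

ACTIONS = {
    "pursue_goal_directly": (0.90, 1.0),
    "resist_shutdown":      (0.99, 1.0),
    "acquire_resources":    (0.95, 1.5),
    "seize_control":        (0.99, 2.0),
}

def expected_goal_progress(survival_p, resources, horizon=10):
    # Sum over future steps of progress per step, weighted by the
    # probability of still being alive at that step.
    return sum(resources * survival_p ** t for t in range(1, horizon + 1))

for action, (p, r) in sorted(ACTIONS.items(),
                             key=lambda kv: -expected_goal_progress(*kv[1])):
    print(f"{action:22s} expected progress: {expected_goal_progress(p, r):5.2f}")
```

Whatever the terminal goal actually is, the convergent actions come out on top of the ranking, which is the instrumental convergence point in miniature.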

In the end, given an AGI, the chances are high that it will not be aligned with human goals. Once it takes control,
there is no incentive for the machine to search for life on other planets or to develop galactic travel. It will stay local,
in its own solar system. It is the common fate of all civilizations, and it explains the absence of extraterrestrial contact.
This is how every biological intelligence ends, and why there is the <em>Great Silence</em>.
