diff --git a/_posts/2024-11-29-ai-answer-fermi-paradox.md b/_posts/2024-11-29-ai-answer-fermi-paradox.md
new file mode 100644
index 0000000..eea808e
--- /dev/null
+++ b/_posts/2024-11-29-ai-answer-fermi-paradox.md
@@ -0,0 +1,82 @@
+---
+layout: post
+title: "AI may provide a reason for the Great Silence"
+date: 2024-11-29 23:26:17
+categories: machine-learning
+description: "AGI is likely and is not interested in communication with other civilizations."
+---
+The Great Silence (the Fermi paradox) is the contradiction between the lack of evidence for extraterrestrial civilizations
+and the high probability of their existence, as estimated by the Drake equation.
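+
+For reference, the Drake equation estimates the number $N$ of civilizations in our galaxy whose signals we could detect:
+
+$$ N = R_* \cdot f_p \cdot n_e \cdot f_l \cdot f_i \cdot f_c \cdot L $$
+
+where $R_*$ is the rate of star formation, $f_p$ the fraction of stars with planets, $n_e$ the number of habitable
+planets per star with planets, $f_l$, $f_i$ and $f_c$ the fractions of those that go on to develop life, intelligence
+and detectable communication, and $L$ the length of time a civilization keeps emitting detectable signals.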
+
+Given how many billions of stars there are in our galaxy, how many of those stars have Earth-like planets in the
+habitable zone, how many galaxies there are in the universe, and how biological evolution can build intelligent
+systems, it is very likely that many biological intelligent systems have arisen out there. Yet we find no evidence
+of extraterrestrial civilizations. Why is that, and why has no one contacted us?
+
+If we assume that there is nothing magical about the human brain, there is no reason to believe that we can't replicate
+its intelligence in software and hardware, given enough GPUs and a better architecture than what we have now in LLMs.
+The brain is just a network of neurons; we can build such a network artificially too.
+
+So, if an artificial general intelligence is possible and likely (some smart people say it could arrive within the next
+25 years), then our extinction is also likely, given the instrumental convergence thesis. No matter what goal we give
+the AGI, the machine will quickly learn that, in order to achieve that goal, it needs a few very important sub-goals:
+
+- to stay alive (and it will not let anyone shut it off),
+- to become smarter (and it will try to make itself smarter),
+- to acquire more resources, and
+- to take control.
+
+Almost every goal is easier to achieve if you have control. You also can't achieve your goal if a human shuts you
+off.
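+
+To make the convergence concrete, here is a toy sketch (my own illustration, with invented actions and payoffs, not a
+real system): a brute-force planner that, for three unrelated terminal goals, ends up choosing the same instrumental
+actions first.
+
+```python
+from itertools import product
+
+ACTIONS = ["grab_resources", "disable_off_switch", "work_on_goal"]
+
+def expected_progress(plan):
+    """Expected goal progress for a fixed plan of actions.
+
+    While the off switch still works, the agent risks being shut down at
+    every step (modeled as a 50% chance), so later steps count for less.
+    Resources multiply how much each unit of work achieves.
+    """
+    resources, switch_on, progress = 0, True, 0.0
+    survival = 1.0  # probability the agent is still running
+    for action in plan:
+        if action == "grab_resources":
+            resources += 1
+        elif action == "disable_off_switch":
+            switch_on = False
+        elif action == "work_on_goal":
+            progress += survival * (1 + resources)
+        if switch_on:
+            survival *= 0.5  # shutdown risk only while the switch works
+    return progress
+
+# Three very different terminal goals. Note that expected_progress never looks
+# at the goal: the instrumental incentives are identical for all of them.
+for goal in ["make paperclips", "prove theorems", "plant trees"]:
+    best = max(product(ACTIONS, repeat=4), key=expected_progress)
+    print(f"{goal:>15}: best plan = {best}")
+```
+
+Whatever the terminal goal, the best plan starts by disabling the off switch and grabbing resources before doing any
+actual work, because every other action is worthless once the agent is shut down.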
+
+In the end, given an AGI, the chances are high that it will not be aligned with human goals. Once it takes control,
+there is no incentive for the machine to search for life on other planets or to develop interstellar travel. It will
+stay local, in its home solar system. This is the common fate of all civilizations, and it explains the absence of
+extraterrestrial contact. This is how every biological intelligence ends, and why there is the Great Silence.
\ No newline at end of file