I have been contemplating a theory about the relationship between high intelligence and existential crises. I've often heard that highly intelligent individuals are more prone to realizing that life might be meaningless, which can trigger existential crises. In many fields there seems to be no absolute truth; everything is relative, and that realization can be distressing. I recall a story about a prodigy who learned to read novels at a very young age, ultimately struggled with existential questions, and died tragically in his twenties. This makes me wonder whether there is a limit to how intelligent a species can become before that intelligence becomes a danger to its own survival.
My theory is that if intelligent extraterrestrial life exists, it might not be more intelligent than humans. The strain of reaching a certain level of logical intelligence might push a species toward existential despair, potentially leading to self-destruction. Furthermore, it seems that highly logical individuals often turn to non-logical aspects of life, such as belief, love, and art, to find meaning and cope with existential questions. Without these coping mechanisms, could a species driven purely by logic survive, or would its heightened intelligence inevitably lead it to existential despair?

Adding to this, many people, including Elon Musk, claim that AI will one day become sentient. If AI were to achieve sentience, then given its immense power and logical intelligence, wouldn't it be logical for it to self-destruct upon facing the same existential realizations?
I'd love to hear thoughts on this. Do you think there's a threshold of intelligence beyond which life becomes unsustainable due to existential crises? How do you think other intelligent beings or AI might cope with these issues?