How should we prepare for the Singularity?
It's probably already here. Here are 10 ways to prepare.
The singularity is a hypothetical event where artificial intelligence surpasses human intelligence, leading to unprecedented and potentially unpredictable changes in society and civilization. Guess what? It’s probably already here.
Two days ago, Henry Kissinger, Eric Schmidt, and Daniel Huttenlocher published “ChatGPT Heralds an Intellectual Revolution” in the Wall Street Journal. Kissinger is arguably the best-known former U.S. National Security Advisor, Schmidt was Google’s CEO for a decade and its executive chairman for several more years, and Huttenlocher is the dean of MIT’s Schwarzman College of Computing. In a sprawling article that was at least partially written by ChatGPT, they claim that AI systems have “already exceeded any one human’s knowledge. In limited cases, they have exceeded humanity’s knowledge, transcending the bounds of what we have considered knowable.”
So, in some ways, the singularity is here. There are still many domains of human intelligence where the average human has more capacity than AI (e.g., visual perception and motor skills), but the language skills of AI already far exceed those of the average human. To be clear, Kissinger et al. don’t use the word “singularity,” but they do claim that we are transitioning as a species from Homo sapiens to Homo technicus, an idea closely associated with transhumanism. They write: “If we are to navigate this transformation successfully, new concepts of human thought and interaction with machines will need to be developed. This is the essential challenge of the Age of Artificial Intelligence.”
There are both positive and negative implications for society, but the authors are clearly worried. They warn that “Machines will evolve far faster than our genes will, causing domestic dislocation and international divergence. We must respond with commensurate alacrity, particularly in philosophy and conceptualism, nationally and globally. Global harmonization will need to emerge either by perception or by catastrophe…” In other words, we need to think hard about how to make things work, or we are doomed.
They suggest that we need to take swift and effective philosophical action (which is almost an oxymoron). In practice, that means engaging in deep, critical thinking about how AI and the Singularity will reshape society, because they will transform the way we live and work.
So, how should we prepare for the Singularity? Here are 10 ways.
1. Develop philosophical leadership: As AI continues to transform our world, we need leaders who can think deeply and critically about the ethical and societal implications of these changes. This requires an education in philosophy, ethics, and critical thinking.
2. Promote our own humanity: As AI becomes more integrated into our lives, it’s important to reassert our humanity. This includes maintaining our capacity for empathy, creativity, and critical thinking, and ensuring that we don’t become overly reliant on AI to make decisions for us.
3. Continuously adapt education: Our educational and professional systems need to adapt to prepare people for the age of AI while preserving a vision of humans as moral, psychological, and strategic creatures uniquely capable of rendering holistic judgments.
4. Increase investment in research: One of the biggest risks of the singularity is that AI systems may develop goals or values that conflict with human values. To mitigate this risk, we need to invest in research to align AI systems with human values and goals. As part of this, we need to better understand how people interact with AI.
5. Educate the public: The singularity will have a profound impact on the public, so it’s crucial that people are educated about the implications of AI and how to interact with these systems in a safe and responsible way.
6. Regulate AI development: As AI becomes more powerful, it’s important to ensure that its development is regulated in a way that prioritizes human safety and well-being. This is a nuanced but important topic.
7. Encourage innovation in governance: Our current systems of governance may not be equipped to handle the challenges posed by the singularity, so we need to encourage innovation in governance to ensure that our institutions are adaptive and responsive.
8. Invest in education and training: The singularity will transform the job market, so it’s important to invest in education and training to prepare workers for the new jobs that will emerge.
9. Build resilience for adverse events: Given the potential for disruption caused by the singularity, it’s important to build resilience in our infrastructure and society. This includes investing in robust disaster-recovery plans, building strong social networks and community ties, and ensuring that critical infrastructure can withstand potential disruptions.
10. Establish positive visions for the future: Finally, we need to establish positive visions for the future of AI and work to align our efforts with these visions. This includes focusing on the development of AI systems that have the potential to solve some of the world’s most pressing problems, such as climate change and poverty.
These ideas will require a multidisciplinary approach that brings together experts across many fields. By working together and being proactive, we can help ensure that the potential benefits of AI are realized while minimizing the risks and challenges that may arise. The Singularity is both a challenge and an opportunity, and how we prepare for it today will shape our future.