Humanity appears to face existential risks: threats that could permanently destroy our long-term potential. We'll examine why existential risks might be a moral priority and explore why society neglects them so much.
Alongside this, we'll discuss the importance, neglectedness, and tractability framework: the most important problems generally affect many people, are relatively under-invested in, and can be meaningfully improved with a modest amount of work.
"Longtermism" is the view that improving the long-term future is a key moral priority of our time. This can bolster arguments for working on reducing some of the extinction risks that we have covered.
Key concepts from this session include:
- Impartiality: Helping those who need it the most, and only discounting people according to location, time, or species if those factors are, in fact, morally relevant.
- Forecasting: Predicting the future is hard, but it can be worth doing in order to make our predictions more explicit and learn from our mistakes.
Questions we'll explore:
- How should we think about the long-term impact of our actions?
- What can we do to prevent the worst possible outcomes for humanity?
- How should we prepare for technology that could reshape our existence?
As always, feel free to join our WhatsApp group chat: https://chat.whatsapp.com/I1v0Pk67zVsBS2nOUfkHxj