When planning for the future, we often try to prepare for the most likely outcome. But this isn't always a great idea. Sometimes you want to plan for the best possible or worst possible outcomes.

You want to plan for the best possible outcomes if your downside is low and your upside is high. An example of this is investing in startups. Investors typically put relatively small amounts of money into each startup, fully aware that most startups will never return their investment. The downside of investing in a startup is losing your small investment; the upside is that the startup becomes extremely successful, goes public or gets acquired, and returns your investment many times (10-100x) over. Your downside is low, but your upside is very high.
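
To make the arithmetic concrete, here is a minimal back-of-the-envelope sketch in Python. The probabilities and payoffs (the 70/20/9/1 split, the $10,000 stake) are made-up numbers for illustration, not real data:

```python
# Back-of-the-envelope expected value of a startup investment.
# All probabilities and payoffs are illustrative assumptions.

investment = 10_000            # dollars invested per startup
outcomes = [
    (0.70,   0),               # 70%: total loss
    (0.20,   1 * investment),  # 20%: roughly break even
    (0.09,  10 * investment),  # 9%:  solid exit, ~10x
    (0.01, 100 * investment),  # 1%:  breakout success, ~100x
]

expected_return = sum(p * payoff for p, payoff in outcomes)
print(f"Expected return per ${investment:,} invested: ${expected_return:,.0f}")
# ~$21,000: the 1% chance of a 100x outcome contributes almost
# half the expected value, even though the single most likely
# outcome is losing everything.
```

Even under these assumptions, where the most likely single outcome is a total loss, the expected return is positive, and it is dominated by the rare 100x case.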

In this sort of domain you want to invest in ideas that sound a bit outlandish – don't look at the most likely outcome of the startup, look at the best possible outcome. If you discard startups because the founders sound crazy and ambitious, you will miss out on all the best companies.

By the same logic, sometimes the domain demands exactly the opposite – focus on the downside, which can be very high, while the upside is more modest. For me these would be things like climate change, nuclear war, and the one that occupies most of my mindspace these days: advanced artificial intelligence.

With global warming, even if the median outcome is fine, there seems to be at least a small (~5-10%) probability of catastrophic impact. We should plan to avoid this catastrophic outcome, even if it isn't currently the most likely scenario.
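
The same expected-value arithmetic, run in reverse, shows why a small probability of catastrophe deserves outsized attention. Again a minimal sketch with assumed numbers (the 5% probability and the 1000x cost multiplier are illustrative, not estimates):

```python
# Expected cost when a small probability is attached to a
# catastrophic outcome. All figures are illustrative assumptions.

scenarios = [
    (0.95, 1),       # 95%: manageable outcome, cost normalized to 1
    (0.05, 1_000),   # 5%:  catastrophe, ~1000x the manageable cost
]

expected_cost = sum(p * cost for p, cost in scenarios)
print(f"Expected cost: {expected_cost:.0f}x the manageable outcome")
# ~51x: the 5% tail dominates the expected cost, so planning
# around the median outcome badly understates the risk.
```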

Similarly with AI there is a lot of naive utopianism. The claim is that AI will cure all disease, automate drudgery, and help humans live more prosperous lives. These things are possible, maybe even likely. But the risks are so many and so serious that we shouldn't brush them aside; it's worth trying to figure out how we can mitigate them.

We don't need to imagine a malicious superintelligence to see why advancements in AI need to be taken seriously. Most people enjoy prosperity today not because the government or their fellow citizens have granted it to them, but because they have agency in creating it. If human labor is not required to run a society, why would humans continue to enjoy the fruits of that labor? If a dozen people and a few million drones are all that's needed to stay in power, how difficult would it be to engineer a coup that replaces democratic governments with a few plutocrats?

And even if you think the above is not the most likely outcome, do you really feel confident saying it doesn't even have a 5% chance of happening?