Chuck Darwin<p>For <a href="https://c.im/tags/longtermists" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>longtermists</span></a>, there is nothing worse than succumbing to an <a href="https://c.im/tags/existential" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>existential</span></a> <a href="https://c.im/tags/risk" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>risk</span></a>: That would be the ultimate tragedy, since it would keep us from plundering our "<a href="https://c.im/tags/cosmic" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>cosmic</span></a> <a href="https://c.im/tags/endowment" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>endowment</span></a>" — resources like stars, planets, asteroids and energy — which many longtermists see as integral to fulfilling our "longterm potential" in the universe.</p><p>What sorts of catastrophes would instantiate an existential risk? The obvious ones are nuclear <a href="https://c.im/tags/war" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>war</span></a>, global <a href="https://c.im/tags/pandemics" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>pandemics</span></a> and runaway <a href="https://c.im/tags/climate" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>climate</span></a> <a href="https://c.im/tags/change" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>change</span></a>. 
But Bostrom also takes seriously the idea that we already live in a giant computer <a href="https://c.im/tags/simulation" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>simulation</span></a> that could get shut down at any moment (yet another idea that <a href="https://c.im/tags/Musk" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>Musk</span></a> seems to have gotten from Bostrom).</p><p>Bostrom further lists "<a href="https://c.im/tags/dysgenic" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>dysgenic</span></a> <a href="https://c.im/tags/pressures" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>pressures</span></a>" as an existential risk, whereby less "intellectually talented" people (those with "<a href="https://c.im/tags/lower" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>lower</span></a> <a href="https://c.im/tags/IQs" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>IQs</span></a>") outbreed people with <a href="https://c.im/tags/superior" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>superior</span></a> <a href="https://c.im/tags/intellects" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>intellects</span></a>.</p>