“Longtermism”

The video below makes a case for “longtermism,” the claim that we should make decisions to maximize positive outcomes in the far future.

On the one hand, as someone firmly in the transhumanist camp, I find this emotionally appealing.

On the other hand, the more I watched the video, the more I was struck by how much “longtermism” seems to be the inverse of the precautionary principle and how much it suffers from the same conceptual problems.

For example, the video suggests that advancing technological progress reduces existential risk. While I agree that this is likely true on balance, it is not hard to envision efforts to advance technology that, at many points, increase the existential risk faced by humanity.

Progress in controlled nuclear fission, for example, may on balance have increased the existential risk to humanity, even with all of the benefits it brought.
