Astronomical Waste

Summary

What matters for present purposes is not the exact numbers but the fact that they are huge. Even with the most conservative estimate, assuming a biological implementation of all persons, the potential for one hundred trillion potential human beings is lost for every second of postponement of colonization of our supercluster.

[[ Nick Bostrom ]] starts out with various calculations, using different metrics to measure the effect, and it is clear that no matter how conservatively one runs the numbers, the figure you arrive at is astronomical and cannot be overlooked.
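To get a feel for where a number like "one hundred trillion per second" comes from, here is a minimal back-of-envelope sketch. The inputs (stars in the supercluster, people sustainable per star, average lifespan) are my own illustrative assumptions in the spirit of the conservative biological estimate, not figures taken from the paper:

```python
# Back-of-envelope sketch of the opportunity cost of delayed colonization.
# All inputs are illustrative assumptions, not Bostrom's exact figures.

stars_in_supercluster = 1e13   # assumed order of magnitude for the Virgo Supercluster
people_per_star = 1e10         # assumed biological population sustainable per star
lifespan_years = 100           # assumed average human lifespan
seconds_per_year = 3.15e7

# Steady-state population the colonized supercluster could support at any time.
total_population = stars_in_supercluster * people_per_star   # ~1e23 people

# If the population turns over once per lifespan, this many potential new
# lives are forgone for every second colonization is postponed.
lives_per_second = total_population / (lifespan_years * seconds_per_year)

print(f"Potential lives lost per second of delay: {lives_per_second:.1e}")
# ~3e13 with these assumptions: tens of trillions, the same astronomical
# order of magnitude as the quoted "one hundred trillion".
```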

… we can take a thicker conception of human welfare than commonly supposed by utilitarians (whether of a hedonistic, experientialist, or desire-satisfactionist bent), such as a conception that locates value also in human flourishing, meaningful relationships, noble character, individual expression, aesthetic appreciation, and so forth. So long as the evaluation function is aggregative (does not count one person’s welfare for less just because there are many other persons in existence who also enjoy happy lives) and is not relativized to a particular point in time (no time-discounting), the conclusion will hold.

The argument does not reduce human welfare to technological advancement for its own sake; the evaluation can take into account the things we consider important and valuable.

If what we are concerned with is (something like) maximizing the expected number of worthwhile lives that we will create, then in addition to the opportunity cost of delayed colonization, we have to take into account the risk of failure to colonize at all. We might fall victim to an existential risk, one where an adverse outcome would either annihilate Earth-originating intelligent life or permanently and drastically curtail its potential.[8] Because the lifespan of galaxies is measured in billions of years, whereas the time-scale of any delays that we could realistically affect would rather be measured in years or decades, the consideration of risk trumps the consideration of opportunity cost. For example, a single percentage point of reduction of existential risks would be worth (from a utilitarian expected utility point-of-view) a delay of over 10 million years.
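The "over 10 million years" comparison follows from simple proportions. Below is a rough sketch, assuming (my assumption, not a figure from the paper) that colonized resources would keep producing value for on the order of ten billion years:

```python
# Toy expected-value comparison behind "1 percentage point of existential-risk
# reduction is worth a delay of over 10 million years".

usable_lifetime_years = 1e10   # assumed span over which colonized resources yield value
total_value = 1.0              # normalize the value of the whole future to 1

# Gaining one percentage point of survival probability gains 1% of the total value.
gain_from_risk_reduction = 0.01 * total_value

# Delaying colonization by a year forgoes roughly one year's share of that value.
loss_per_year_of_delay = total_value / usable_lifetime_years

break_even_delay_years = gain_from_risk_reduction / loss_per_year_of_delay
print(f"Break-even delay: {break_even_delay_years:.1e} years")
# ~1e8 years with these assumptions, comfortably "over 10 million years".
```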

Again and again, Nick stresses the need for, and the urgency of, a cause that is both important and pressing yet often overlooked: avoiding existential risk.

… human extinction would be bad only because it makes past or ongoing lives worse, not because it constitutes a loss of potential worthwhile lives. What ought someone who embraces this doctrine do? Should he emphasize speed or safety, or something else?

It is mind-boggling to think that all it would take is an asteroid even a fraction the size of the Chicxulub impactor: in the best-case scenario we get set back a few centuries of development, and you already know the worst case, sharing the fate of the dinosaurs. But hey, the wiki page for asteroid impact avoidance looks hopeful.

For the person-affecting utilitarian, it is not enough that humankind survives to colonize; it is crucial that extant people be saved. This should lead her to emphasize speed of technological development, since the rapid arrival of advanced technology would surely be needed to help current people stay alive until the fruits of colonization could be harvested. If the goal of speed conflicts with the goal of global safety, the total utilitarian should always opt to maximize safety, but the person-affecting utilitarian would have to balance the risk of people dying of old age with the risk of them succumbing in a species-destroying catastrophe.
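The tension between the two value theories can be made concrete with a toy expected-value comparison. Every number below is invented purely to illustrate the structure of the trade-off, not anything Bostrom computes:

```python
# Toy model of the speed-vs-safety trade-off under two value theories.
# All numbers are made up to illustrate the shape of the argument.

strategies = {
    # (probability of existential catastrophe, fraction of currently existing
    #  people still alive when the fruits of colonization arrive)
    "rush development": (0.20, 0.60),
    "prioritize safety": (0.10, 0.20),
}

for name, (p_catastrophe, frac_current_alive) in strategies.items():
    p_survive = 1 - p_catastrophe
    # Total utilitarian: value is dominated by the astronomical number of
    # potential future lives, so it scales with survival probability alone.
    total_util = p_survive
    # Person-affecting utilitarian: only existing people count, and they must
    # personally live long enough to enjoy the benefits.
    person_affecting = p_survive * frac_current_alive
    print(f"{name:>18}: total={total_util:.2f}, person-affecting={person_affecting:.2f}")

# With these invented numbers, the total utilitarian prefers safety (0.90 > 0.80)
# while the person-affecting utilitarian prefers speed (0.48 > 0.18).
```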

The clock is ticking, and each second is more expensive than the last. All we have is the Pale Blue Dot; if we are not careful enough, it will be too late once midnight strikes on our Doomsday Clock.


Return to PPR

