Thesis:

Antecedents:

Justification:

Discussion:

Some worry about a “fast takeoff” scenario, in which AI takes over so fast (in hours, days, or weeks) that we don’t have time to react. I am personally doubtful about the plausibility of this scenario; nonetheless, I feel we have little chance to

~~ TWO PROCESSES NATURALLY IN TENSION: The knowledge required to

If the former happens before the latter, AGI is unlikely to unfold in a way that is auspicious for humanity; if the latter comes first, that is our best hope for a good outcome, since we can see where we are going.

~~ Underlying this is an assumption that knowledge is monotonic and progressive: it rarely goes backwards.

The outcome of having ever greater knowledge is a foregone conclusion: not only the having of it, but also the application of that knowledge. (We have never, as a species, known a thing yet failed to use it at least once.)

Thus the only thing we can really shape is the way that knowledge is used.

~~