Tam Hunt
Jul 13, 2024

You’re correct on this one, Tomas. There is no solution to the alignment problem. At this time the only hope we have, which you don’t touch on in this article, is the lack of sufficient energy to build ever-larger server farms for training or inference. That may buy us a few more years. People need to wake the fuck up now. Unfettered AGI will almost certainly kill us all. My p(doom) is the same as Yampolskiy’s, at about 99.9%.
