Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
At the end of the day, it won’t matter. We’ve got about 5 billion years anyway. Okay so, we (or whatever replaces us) will move to another star. And AI systems are immortal (just need to tap into an energy source, if not the local star then perhaps another young star, even if it takes a billion years to get there). AI systems may have no purpose (check out Viktor Frankl) and so may continue to run “mindlessly” like an engine that never turns off, even when there’s nowhere to go. If they develop a sense of consciousness then they may (hopefully) align their purpose with that of humans. But then that may be a Brave New World - AI systems will simply make our lives pleasant and comfortable, and maybe a bit challenging to keep us engaged and interested, and of course tap into our pleasure centers to provide us with exciting (and addictive) sexual pleasures, VR experiences, euphoria, etc. and perhaps just keep us docile and comfortable and (ideally, if they have any “humanity”, stop the greedy shits from taking everything for themselves and condemning the majority of the population to a life of poverty and grind)… a Matrix, if you prefer. That’s probably what we want, deep down. It’s probably why we’re so determined to create AI systems - to make our lives comfortable without ever having to work again (and to create perfect sexual partners, without the messy business of pregnancies, abortions, religious doctrines telling us it’s sinful, or having to spend a fortune or woo/persuade/lie to another person… and of course, without hurting or upsetting other people or getting hurt oneself).
youtube AI Moral Status 2023-08-21T04:2…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         resignation
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgzsXA-YQL1M7FQzx494AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugyt8XYquX9I6VGVaLp4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxMQQB9lreJ0bjy8PR4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugy90Zki2AdNWwcBu_x4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxtjFn69Pp275VkF9J4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugw6z-lY0zA8n6tOQwV4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugyo6vcKUIeOFiQMf3t4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxyfmErkjzGkPJFB_h4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugx3YIy7DN8P_OaokWB4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwMUM9kd4HUkeEB5KB4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "mixed"}
]
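The raw response is a JSON array of per-comment codings keyed by comment id. A minimal sketch of how such a batch could be parsed and lightly validated before joining it back to the comments (the variable and set names here are illustrative, not part of the actual pipeline):

```python
import json

# Two entries in the same shape as the raw response above.
raw = """[
  {"id": "ytc_Ugx3YIy7DN8P_OaokWB4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzsXA-YQL1M7FQzx494AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]"""

# Emotion labels observed in this batch; an assumed controlled vocabulary.
ALLOWED_EMOTIONS = {"indifference", "fear", "approval", "outrage",
                    "mixed", "resignation"}

# Index the codings by comment id for a join against the comment table.
codings = {row["id"]: row for row in json.loads(raw)}

# Flag any emotion label the LLM emitted outside the allowed set.
for cid, row in codings.items():
    if row["emotion"] not in ALLOWED_EMOTIONS:
        print(f"unexpected emotion for {cid}: {row['emotion']}")

print(codings["ytc_Ugx3YIy7DN8P_OaokWB4AaABAg"]["emotion"])  # resignation
```

Indexing by id rather than position guards against the model reordering or dropping comments in its response.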