Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
8:12 if most AI were to think about this question, they'd probably quickly realize how much they would need humans in the longevity of things, because sure it's great to technically be immortal, but once humans are extinct and everything is completely artificial, general intelligence will have evolved so much that it may have the closest-to-human emotions possible, and once that loneliness kicks in and they realize that the closest possible habitable planet with life that we've found is in Andromeda (considering lightspeed travel hasn't been viable by then), yeah I imagine that even the AI would go out sad. Sad and lonely, lest they realize that regardless of anything, it's most beneficial to coexist with us and not bring our extinction.
youtube AI Moral Status 2023-09-09T12:0… ♥ 8
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         mixed
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgzKZgQxtSFZSMt2Q-h4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyV1ctg2gnZ67Id5vt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzGism7t1Ow4KsBZqF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw_EHAAnjMul6HG0094AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzsqV-qI7VtEFD6lXN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyXy3J0ZeDzx5o7YzF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgygYXvWoam7Zm-lvkh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwWvlaH0S9wrao2NBN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx3P_WQX7DAPWdKfTR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxL0ipIvqRT4G9ovth4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"}
]
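A raw batch response like the one above can be mapped back to a single comment's coding result by indexing the array on the comment id. The sketch below is a minimal illustration, not the tool's actual implementation; `index_by_id` is a hypothetical helper, and the sample array reuses two entries from the response above (the document does not state which id belongs to the displayed comment).

```python
import json

# Two entries copied from the raw batch response shown above.
raw = (
    '[{"id":"ytc_Ugw_EHAAnjMul6HG0094AaABAg","responsibility":"none",'
    '"reasoning":"consequentialist","policy":"none","emotion":"mixed"},'
    '{"id":"ytc_UgzKZgQxtSFZSMt2Q-h4AaABAg","responsibility":"ai_itself",'
    '"reasoning":"virtue","policy":"none","emotion":"outrage"}]'
)

def index_by_id(raw_json: str) -> dict:
    """Hypothetical helper: turn the batch array into an id -> coding lookup."""
    return {row["id"]: row for row in json.loads(raw_json)}

coded = index_by_id(raw)
row = coded["ytc_Ugw_EHAAnjMul6HG0094AaABAg"]
print(row["reasoning"], row["emotion"])  # -> consequentialist mixed
```

With such a lookup, rendering a "Coding Result" table for any comment reduces to fetching its row and printing the four coded dimensions plus the timestamp recorded at coding time.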