Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The premise of this book is a seriously anthropocentric, myopic take. A super-intelligent AI wouldn't need to turn Earth into a chip; it doesn't require the same environment we do to "survive". Chances are, they wouldn't see the point in staying here and competing with us. Truth is, we don't know what they'll do, but getting rid of us is litterally a waste of energy. Now I'm not saying there are 0 risks, but they are much lesser logically IF they are truly super intelligent. Dumb AI is objectively much more dangerous. And by the way we already have a paperclip maximiser AIs, it's called capitalism and corporations and it runs on people instead of silicon.
Source: youtube · AI Moral Status · 2025-10-30T20:0…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgxdWB2GvyUuqIVlCi54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgybtjBUk39J3illv054AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy6W6lYH1D8Uj9Bwxl4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzLmQDK4VS0RkkLAUd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugw83iGH3FmGlHOpS314AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugw5gyINpG8jmJV9s6V4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzelWm4EbPVk114lMd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugyk7e-1BrjucVChMBR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxBdApmyz7dTqviZ154AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz590g8tnUELebYGlN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"}
]
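A raw response like the one above can be checked before its values are trusted as a coding result. The sketch below is a minimal, hypothetical validator: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the raw output itself, but the allowed-value sets are assumptions inferred from the values seen in this batch, not the pipeline's actual coding scheme.

```python
import json

# Assumed coding scheme, inferred from values observed in the raw
# response above -- not a definitive schema for the pipeline.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "company", "distributed"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"fear", "indifference", "mixed", "outrage"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw JSON array of codings, keeping only records whose
    dimension values all fall inside the assumed allowed sets."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Hypothetical single-record example (the id is made up for illustration).
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"indifference"}]')
print(parse_codings(raw))  # prints the one valid record
```

Filtering rather than raising keeps a single malformed record from discarding the whole batch; a stricter pipeline might instead log rejected records for manual review.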