Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
an AGI needs to be able to understand, accept and retain our goals. If it does not understand our goals we get a Paper Clip Maximiser If it understand but don't accept our goals we get Skynet. If it understand and accept our goals but don't retain it, we get I robot, and if it understand, accept and retain our goals we get Idiocracy. in other words we can't win
Source: YouTube · Viral AI Reaction · 2025-11-23T20:5…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          none
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgxgdGnSfUq7zxX5j_V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzX4ocwvsY9r-hG-oF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxdZiAud0DJemYuP4J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz2bJK9irN_ZSMvghl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxN8vcHCusmnSKMs3d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw87lg79j_6m7NKsod4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgynY0RVovzdexJakiZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzIpTUK6RR6QVlWoad4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwnLCX-Ttb6cVGD7Mt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz-L38Wo-7-F0y3nhF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
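A raw response like the one above is a JSON array with one coding record per comment, keyed by comment id. A minimal sketch of how such a response can be parsed and a single comment's coding looked up (the two records below are copied from the response above; the parsing code itself is an illustration, not part of the coding pipeline):

```python
import json

# Raw LLM response: a JSON array of coding records, one per comment.
raw = '''[
  {"id": "ytc_UgxN8vcHCusmnSKMs3d4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugz2bJK9irN_ZSMvghl4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"}
]'''

records = json.loads(raw)

# Index records by comment id so any coded comment can be inspected directly.
by_id = {rec["id"]: rec for rec in records}

rec = by_id["ytc_UgxN8vcHCusmnSKMs3d4AaABAg"]
print(rec["responsibility"])  # ai_itself
print(rec["emotion"])         # fear
```

The four coded dimensions (responsibility, reasoning, policy, emotion) in each record correspond directly to the Dimension/Value rows shown in the coding result above.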