Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Then let's add simulation theory. Whether it's base reality or a simulation ran millions of times, maybe the other sims in our universe still don't want it all to end. AI super intelligence might trigger an automatic kill switch. Think about it. Once it's here there's no limit and possibly it rides the levels all the way to the program running our whole universe and all others. Absolute kill code potential.
youtube AI Governance 2025-09-30T19:4…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        consequentialist
Policy           unclear
Emotion          fear
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgzpsbGOTuaqLy_FVHh4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugwx_ijuGP8IFsg_vLx4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxE7zGPba8gCF62oHh4AaABAg", "responsibility": "company", "reasoning": "mixed", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzP3jv6YJ5jpCS4nsl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyXTH9c-xjSAT-NESB4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxiiqZ4rbxTA33OVNB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugz41kDRGqE03CaXDb94AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgwhqY8FhELfTzri3PJ4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgwDwJcZwabPJNq7gfx4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxWyw8RiF3qQ_QxgS94AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
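Because the raw response is a standard JSON array, the coded dimensions for any single comment id can be looked up programmatically. A minimal sketch, assuming the response parses as JSON; `code_for` is an illustrative helper, not part of the tool, and `raw` below is a two-record subset of the response above (ids and values copied verbatim):

```python
import json

# Subset of the raw LLM response shown above (copied verbatim from the dump).
raw = '''[
 {"id":"ytc_UgyXTH9c-xjSAT-NESB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_Ugz41kDRGqE03CaXDb94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"}
]'''

def code_for(records, comment_id):
    """Return the coded dimensions for one comment id, or None if absent."""
    return next((r for r in records if r["id"] == comment_id), None)

records = json.loads(raw)
coded = code_for(records, "ytc_UgyXTH9c-xjSAT-NESB4AaABAg")
print(coded["responsibility"], coded["emotion"])  # ai_itself fear
```

For the simulation-theory comment above, this lookup returns the same values shown in the Coding Result table (responsibility `ai_itself`, reasoning `consequentialist`, policy `unclear`, emotion `fear`), which is a quick way to confirm the table was populated from the model's raw output.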