Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It's helped me but I don't use any of my real personal type information outside of just what the actual situation is because I've seen chatbots literally use out-of-character signs such as )) \\ and stuff like that, but i can totally see how the wrong words in an artificial vacuum could result in something really terrible
YouTube AI Moral Status 2024-12-18T07:3…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          none
Emotion         fear
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgzE2TCxmQoCUEvKjHx4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyzdZkUhc-4iQP5rO54AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugy4TYq-nNkfwxVuYdF4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxWd9AX6N7KqvYuQ3p4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzAcKJ5bC9jf9B0oNd4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyQ81nhVPw69FsPijx4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugwoe2ZcEXCSas1bf8R4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwyBa736hGSvN21g6F4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxLKTyGB6Li_6KQWbl4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyzrtMc4SWXi5U0AWJ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
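When inspecting raw responses like the one above, it helps to check each batch against the coding scheme programmatically. The sketch below is a minimal validation pass; the allowed value sets are inferred only from this sample batch (an assumption — the actual codebook may define additional categories), and the `validate_batch` helper name is hypothetical.

```python
import json

# Allowed values per dimension, inferred from this sample batch only;
# the real codebook may permit more categories (assumption).
SCHEMA = {
    "responsibility": {"ai_itself", "company", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"indifference", "outrage", "resignation", "fear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and flag any out-of-schema codes."""
    errors = []
    for row in json.loads(raw):
        for dim, allowed in SCHEMA.items():
            value = row.get(dim)
            if value not in allowed:
                errors.append({"id": row.get("id"), "dimension": dim, "value": value})
    return errors

# Example with a placeholder comment ID:
raw = '[{"id":"ytc_x","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}]'
print(validate_batch(raw))  # an empty list means every code is in schema
```

Running this over each stored raw response surfaces hallucinated or off-schema labels before they reach the coded table.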