Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I honestly think that a Terminator scenario (Robot Rebellion, or whatever xD) is ridiculously unlikely. Won't the very first line of code be "Do not kill any humans"? Followed ofc. by "Do not alter these rules"?
youtube 2013-09-25T20:3…
Coding Result
Dimension      | Value
Responsibility | developer
Reasoning      | deontological
Policy         | none
Emotion        | indifference

Coded at: 2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ugw8p-R6bjwHYAKgkVB4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxbYcfJHJzjEcgoxHZ4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyveGROYaH0S1oBFBp4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxV8t0h1d_SURm03et4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwTGcAfD8jkC_yxBXV4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxR9vw3fBh2aBAHuip4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxCATKoPqN_lrmgVgR4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwxKs1kmSHzdUWhuuV4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugxm8Det6C2RgozuVap4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxxPP9ryIoPM7XHN5N4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "none", "emotion": "approval"}
]
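The raw response above is a JSON array, one record per coded comment, so looking up the coding for a single comment id is a plain parse-and-filter. A minimal sketch follows; the helper name `coding_for` is illustrative, not part of the tool, and the `raw` string is shortened to two of the ten records shown above:

```python
import json

# Raw batch response as returned by the model (truncated to two
# of the ten records above, purely for illustration).
raw = """[
  {"id": "ytc_Ugw8p-R6bjwHYAKgkVB4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxV8t0h1d_SURm03et4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]"""

def coding_for(comment_id, raw_json):
    """Return the coding record for one comment id, or None if absent."""
    for record in json.loads(raw_json):
        if record.get("id") == comment_id:
            return record
    return None

coded = coding_for("ytc_Ugw8p-R6bjwHYAKgkVB4AaABAg", raw)
print(coded["responsibility"], coded["emotion"])  # developer indifference
```

For the comment quoted at the top of this record, the lookup recovers the same four dimension values shown in the coding-result table.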