Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
AI doesn't have emotions or wants unless designed to have them. So why would it ever want to kill us?
youtube 2018-04-04T07:0…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       unclear
Policy          unclear
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ugy2wY0IUgphOxSqAuN4AaABAg", "responsibility": "unclear",   "reasoning": "unclear",          "policy": "unclear",  "emotion": "indifference"},
  {"id": "ytc_UgySeyiz2QpQlII20MR4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugy-oeWVwWrKW8PIxQ94AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none",     "emotion": "approval"},
  {"id": "ytc_UgzW3d21sud-E-HrJYN4AaABAg", "responsibility": "unclear",   "reasoning": "consequentialist", "policy": "none",     "emotion": "outrage"},
  {"id": "ytc_UgzrSAz85WeSwX_VKg94AaABAg", "responsibility": "ai_itself", "reasoning": "unclear",          "policy": "unclear",  "emotion": "indifference"},
  {"id": "ytc_UgzBm_yAJquStsoKJwF4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugygpk9DbFcjQDVngT54AaABAg", "responsibility": "unclear",   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzxMFjSdPHNG5d-ccR4AaABAg", "responsibility": "unclear",   "reasoning": "unclear",          "policy": "unclear",  "emotion": "indifference"},
  {"id": "ytc_UgxJV30qMXFyg0xs8cZ4AaABAg", "responsibility": "unclear",   "reasoning": "consequentialist", "policy": "none",     "emotion": "resignation"},
  {"id": "ytc_UgzL_pXtisAljdRajp94AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
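The raw response above is a JSON array of per-comment records, each carrying an "id" plus the four coded dimensions. A minimal sketch of how such a batch could be parsed and indexed by comment id (the helper name `index_codings` and the two sample records are illustrative, not part of the tool):

```python
import json

# Two records copied from the batch above, as a stand-in for the full response.
raw_response = '''[
  {"id": "ytc_UgzrSAz85WeSwX_VKg94AaABAg",
   "responsibility": "ai_itself", "reasoning": "unclear",
   "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzBm_yAJquStsoKJwF4AaABAg",
   "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"}
]'''

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Map comment id -> {dimension: value}, defaulting missing keys to 'unclear'."""
    records = json.loads(raw)
    return {r["id"]: {d: r.get(d, "unclear") for d in DIMENSIONS} for r in records}

codings = index_codings(raw_response)
print(codings["ytc_UgzrSAz85WeSwX_VKg94AaABAg"]["emotion"])  # indifference
```

Defaulting absent dimensions to "unclear" mirrors the scheme's own fallback value, so a partially coded record never raises a KeyError downstream.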