Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
This is not at all surprising or unexpected. Humans do the same thing when lacking empathy. They are called psychopaths. Unless an AI is given specific instructions to value well being as 1st priority, it will do literally anything possible to reach whichever goal is given priority.
YouTube · AI Harm Incident · 2025-07-23T18:1…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          liability
Emotion         fear
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_Ugwbux0NNkjdVKiRJLF4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxMbL2XYToViPAKQUh4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwlEKq1Y7fzNMvQi394AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgzheSfjtZlNDkJtNA94AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyVD9GjJyN58AtV-jp4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyzOmysSg0nl5hJj5d4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgyEUDu_20utTr0QKX54AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyC0D_L3clvsKTnfXJ4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugy8BPVxwnUAAaB5dQh4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyP95xHKDhNP_KdByZ4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "regulate", "emotion": "outrage"}
]
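A minimal sketch of how a raw response like the one above could be matched back to a coded comment, assuming the response is a JSON array of records with `id`, `responsibility`, `reasoning`, `policy`, and `emotion` keys (the abbreviated `raw` payload below reuses one record from the response for illustration; variable names are hypothetical):

```python
import json

# Abbreviated raw LLM response: one record copied from the array above.
raw = """[
  {"id": "ytc_UgwlEKq1Y7fzNMvQi394AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]"""

# Index the batch by comment id so a single coded comment can be inspected.
codes = {rec["id"]: rec for rec in json.loads(raw)}

rec = codes["ytc_UgwlEKq1Y7fzNMvQi394AaABAg"]
print(rec["responsibility"], rec["emotion"])  # developer fear
```

Keying on `id` lets the exact model output for any comment be retrieved directly, which is what the inspection view above displays.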