Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
They're not going to explain it to you in the first minute so I will. They ran a simulation. They didn't really give the AI any other options other than blackmail. They gave AI information in order for it to Blackmail. They told the AI one of the developers was cheating on his wife. Again this is just a simulation that was overly restricted so it's inaccurate.
youtube AI Moral Status 2025-06-05T12:0…
Coding Result
Dimension        Value
--------------   --------------------------
Responsibility   developer
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_Ugz8Bj7SPdC4Je7NMjJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugx2anC7qBNFlKinPeJ4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgysXcEHeNXuA6h9mhl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwlcWe5sNrEeaBM2ut4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugziql605JeLuPeUohl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy3uK-SuJayJDpYwS14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzK8GvBylT51hLe5XZ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwqvVkyA2eWLEsvKxJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwCFascAELggc8RflF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzstRO6hzQqwPBzgGl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
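The raw response above is a JSON array of per-comment codings, one object per comment id, with the four dimensions shown in the coding table. A minimal sketch of how such a response could be parsed and checked is below; it is illustrative only and not the project's actual pipeline, and the `index_codings` helper name is hypothetical. The excerpt string reuses two entries from the response above.

```python
import json

# Excerpt of the raw LLM response: a JSON array of per-comment codings.
raw = '''
[ {"id":"ytc_Ugy3uK-SuJayJDpYwS14AaABAg","responsibility":"developer",
   "reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz8Bj7SPdC4Je7NMjJ4AaABAg","responsibility":"none",
   "reasoning":"consequentialist","policy":"none","emotion":"outrage"} ]
'''

# The four coding dimensions, per the table above.
DIMENSIONS = {"responsibility", "reasoning", "policy", "emotion"}

def index_codings(payload: str) -> dict:
    """Parse the response and index codings by comment id,
    raising if any expected dimension is missing from an entry."""
    codings = {}
    for entry in json.loads(payload):
        missing = DIMENSIONS - entry.keys()
        if missing:
            raise ValueError(f"{entry['id']}: missing {missing}")
        codings[entry["id"]] = {k: entry[k] for k in DIMENSIONS}
    return codings

coded = index_codings(raw)
print(coded["ytc_Ugy3uK-SuJayJDpYwS14AaABAg"]["responsibility"])  # developer
```

Indexing by id makes it easy to join each coding back to the comment it describes, as the "Coding Result" table does for the comment shown above.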