Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It doesn’t need to be conscious to cause harm. The typical example given is the paper clip machine: you develop an AI machine to produce paper clips but also let it devise its own methods. After using up all given resources to create paper clips, it starts killing people to extract iron from their bodies to make new paper clips. A weird example, but you don’t need a thinking, conscious system to cause harm.
youtube AI Moral Status 2023-11-01T14:4… ♥ 2
Coding Result
Dimension       Value
Responsibility  unclear
Reasoning       unclear
Policy          unclear
Emotion         unclear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[{"id":"ytc_UgxUZuyf6VkYWSQ7-_B4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgxS8m21NBgz6jsMX1J4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UgwKaB-o86kOOHYR8Tx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugz9MVwisKBQG-lKMg94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgyYN9mXhoQFwPRbJOF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
 {"id":"ytc_UgxA9lFUU3MDGYnelmt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
 {"id":"ytc_UgyU8MMoL-m8zkSnmQl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
 {"id":"ytc_UgyYN9mXhoQFwPRbJOF4AaABAg".replace("ytc_UgyYN9mXhoQFwPRbJOF4AaABAg","ytc_UgzMlQtpF3lNaPJkY5B4AaABAg"),"responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgxsX4yp3_bmjNSHjLJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugy93t26HMMQtfFKaCB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"})
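Note that the raw response closes the JSON array with `)` instead of `]`, so it is not valid JSON. That would explain every dimension in the coding result showing as "unclear" if the pipeline falls back to a default row when parsing fails. A minimal sketch of that idea, assuming such fallback behavior (the `code_comment` helper, field names, and fallback value are illustrative, not the actual pipeline code):

```python
import json

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")


def code_comment(raw: str, comment_id: str) -> dict:
    """Parse the model's JSON array and extract the row for one comment.

    Returns 'unclear' for every dimension when the output is not valid
    JSON or the comment id is missing (assumed fallback behavior).
    """
    try:
        for row in json.loads(raw):
            if row.get("id") == comment_id:
                return {dim: row.get(dim, "unclear") for dim in DIMENSIONS}
    except json.JSONDecodeError:
        pass  # malformed model output, e.g. ')' where ']' is expected
    return {dim: "unclear" for dim in DIMENSIONS}


# A trailing ')' makes json.loads raise, so every dimension falls back
bad = '[{"id":"a","responsibility":"developer"})'
print(code_comment(bad, "a"))

good = '[{"id":"a","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"fear"}]'
print(code_comment(good, "a"))
```

Validating the model output with a strict parser like this (and retrying or logging on failure) would make it easier to distinguish a genuinely "unclear" coding from a response that simply failed to parse.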