Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The question is why you would create an AI that would have wants and needs that could be opposed, the ideal AI isn't a person, it just does what it's told like any appliance or computer program. Even if it has a generalised intelligence, it will be motivated by whatever we program it to be. It's not going to spontaneously develop human-like emotions and feelings.
youtube · AI Moral Status · 2020-07-08T12:2… · ♥ 1
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ugxh0ApZshMLi2I9lg54AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugzo3L3CCqsdlN-m7-14AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzPbDSm42mBm4VhNHJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyETUx6eRP1883qokt4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyAy8gYxtFGpxiBvxR4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyxtIRrGQn8pPkdaDh4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwDnfchUIYpixE_E414AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxBJzobPDCJQGthwmV4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugy1GsZaucTfOLezmmN4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugz2BJmqXzosAj8mDTJ4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "mixed"}
]
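To inspect the exact model output for a single coded comment, the raw response can be parsed as a JSON array and indexed by comment id. A minimal sketch in Python, using an excerpt of the response above (the `raw` string is shortened here for brevity; in practice it would hold the full array):

```python
import json

# Excerpt of the raw LLM response shown above: a JSON array of
# per-comment coding records with four dimensions each.
raw = (
    '[{"id":"ytc_Ugy1GsZaucTfOLezmmN4AaABAg",'
    '"responsibility":"developer","reasoning":"consequentialist",'
    '"policy":"none","emotion":"indifference"}]'
)

# Index the records by comment id for direct lookup.
codes = {row["id"]: row for row in json.loads(raw)}

# Look up the comment displayed in the Coding Result table.
record = codes["ytc_Ugy1GsZaucTfOLezmmN4AaABAg"]
print(record["responsibility"], record["emotion"])  # → developer indifference
```

The looked-up record matches the Coding Result table above: responsibility `developer`, reasoning `consequentialist`, policy `none`, emotion `indifference`.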