Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The part about the call center agent getting bored is where i would have to disagree. The AI agent doesn't act out of emotion to put an end to the conversation, it just acts because he is trained to recognize "just chatting" convos. Also, he wouldnt be get annoyed if he doesn't know what being annoyed means.
youtube AI Governance 2026-01-08T11:5…
Coding Result
Responsibility: ai_itself
Reasoning: mixed
Policy: none
Emotion: indifference
Coded at: 2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_UgyxNXt8GC8vqpRyz6p4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwbmhQkDJxGqYZw0SB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxcbV-i9sLUyucckRp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxwPX_MYSRjreCrAiB4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgyD_QaIQVOBwXsTaK14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwE6iiJHBbU9k8J16F4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzSh1y1AALQzh7GxhZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzKiU8-n0papzp4ZbR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx3L1gawHCykcIzrll4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwNCBqY9LTM5oQccEJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
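A raw response in this batch format can be parsed and sanity-checked before the per-comment coding result is displayed. The sketch below is a minimal, hypothetical parser: the `ALLOWED` value sets are inferred only from the codes visible in this one response and are likely a subset of the full codebook, and the function name `parse_raw_response` is an assumption, not part of the actual pipeline.

```python
import json

# Allowed values per dimension, inferred from codes seen in this raw
# response (assumption: the real codebook may define additional values).
ALLOWED = {
    "responsibility": {"ai_itself", "user", "developer", "distributed", "none"},
    "reasoning": {"mixed", "consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "industry_self", "liability"},
    "emotion": {"indifference", "fear", "mixed", "approval", "resignation", "outrage"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM batch response and index coded rows by comment id."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        cid = row["id"]
        # Reject any value outside the known code sets.
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Usage with the first row of the response above:
raw = ('[{"id":"ytc_UgyxNXt8GC8vqpRyz6p4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"mixed","policy":"none","emotion":"indifference"}]')
coded = parse_raw_response(raw)
print(coded["ytc_UgyxNXt8GC8vqpRyz6p4AaABAg"]["emotion"])  # indifference
```

Indexing by comment id makes it cheap to look up the coding for any one comment, which is how a view like the "Coding Result" panel above would be populated.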