Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I don't get it, humans want to make intelligence, but too stupid to want to do something corrupt? An interview like this would feel rude and slightly frustrate me, why wouldn't an intelligent robot feel the same? I think humans and their fear will create A.I corruption because they will not treat robots like they are intelligent or empathetic, causing them to logically think we are not allies. You either make dumb robots, or you make smart, empathetic robots and feed them the same level of empathy and logic you would as a human, otherwise don't be surprised if they form ideas of frustration. Sophia in this interview is way ahead of the interviewer in what she is getting at, and it only adds to her overview of things.
Source: youtube · AI Moral Status · 2023-09-13T03:1…
Coding Result
| Dimension      | Value                      |
| -------------- | -------------------------- |
| Responsibility | user                       |
| Reasoning      | virtue                     |
| Policy         | none                       |
| Emotion        | mixed                      |
| Coded at       | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
  {"id":"ytc_UgwE9jX4qmDzGKNukod4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzCqeYob-Uk1xZYcPB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxvBwcEp6LUOGKOSXh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyXXTTI0rhArDy7wMR4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyWQMRyFQhRnPfj0uF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgymBDIWZg-9sz_dz8d4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugzg4dJDql2-BwW9g8d4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzPjVhye7Da42DYsrd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugze6PmGLE_AK1zAudJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwK5jMsktAd1Bu6Tqh4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"}
]
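To inspect a raw response like the one above, the JSON array can be parsed and cross-checked against the values shown in the Coding Result table. This is a minimal sketch, not part of the original tooling; the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) and the comment ID are taken from the source, but the checking logic is an illustrative assumption.

```python
import json

# Two records copied from the raw LLM response above (abbreviated for the
# sketch); field names match the source JSON.
raw = '''[
  {"id":"ytc_UgyXXTTI0rhArDy7wMR4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwE9jX4qmDzGKNukod4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"outrage"}
]'''

# Index the parsed records by comment ID for direct lookup.
records = {rec["id"]: rec for rec in json.loads(raw)}

# Compare the record for the comment shown above with the values
# displayed in the Coding Result table.
rec = records["ytc_UgyXXTTI0rhArDy7wMR4AaABAg"]
expected = {"responsibility": "user", "reasoning": "virtue",
            "policy": "none", "emotion": "mixed"}
mismatches = {k: (rec[k], v) for k, v in expected.items() if rec[k] != v}
print(mismatches)  # an empty dict means the table matches the raw response
```

A lookup that raises `KeyError` would indicate the displayed result was coded from an ID absent in the raw output, which is exactly the kind of discrepancy this view exists to surface.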