Raw LLM Responses

Inspect the exact model output returned for any coded comment.

Comment
Why didn't he just ask the question he wanted to ask? He did it so indirectly as to make the robot say it, which makes me think he's just fear-mongering.
youtube AI Harm Incident 2024-04-22T04:2…
Coding Result
Dimension       Value
Responsibility  user
Reasoning       deontological
Policy          none
Emotion         mixed
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgwS2Yxyswmao9XTVCR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyW1aqMoh2iZ7woqE14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgxrisFGHM4CL2m6zLl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzPcw7ZHfKa1kgLk5N4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwyD-obgIZucAHud2J4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugy8HK_2oQPL19RaK8J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugw0eean9mpR5ZKazeZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyJV9vSlms1um7xXsx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugyy7cLUdV7ixDA_yZh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyD9uetntjs7nf9E314AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"}
]
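A raw batch response like the one above can be parsed and checked against the codebook before the per-comment dimensions are stored. The sketch below assumes a codebook inferred from the values seen in these responses (the actual allowed value sets may differ); `parse_and_validate` is a hypothetical helper name.

```python
import json

# Assumed codebook, inferred from the responses shown above -- not an
# authoritative schema for this coding pipeline.
CODEBOOK = {
    "responsibility": {"developer", "user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "ban", "liability", "industry_self", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "mixed"},
}

def parse_and_validate(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and reject out-of-codebook values."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in CODEBOOK.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec.get('id')}: {dim}={value!r} not in codebook")
    return records

raw = ('[{"id":"ytc_UgwS2Yxyswmao9XTVCR4AaABAg","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"none","emotion":"resignation"}]')
coded = parse_and_validate(raw)
print(coded[0]["responsibility"])  # distributed
```

Validating at parse time keeps malformed or hallucinated labels out of the coded dataset; a record with any value outside the codebook fails loudly rather than being silently stored.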