Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Asimov's Three Laws of Robotics are a set of rules intended to govern the behavior of robots, particularly in science fiction. These laws, as formulated by Isaac Asimov, are: 1) A robot may not injure a human being or, through inaction, allow a human being to come to harm. 2) A robot must obey the orders given it by human beings except where such orders would conflict with the First Law. 3) A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
youtube · AI Harm Incident · 2025-07-26T21:5…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       deontological
Policy          regulate
Emotion         indifference
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_Ugzwe76NfuYtBb1jtXl4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzGEbyz2xO0qlyZ6rx4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxhYP1mDwT_HfcocOR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugwnq9VnYtUCWV4PZCt4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgxBNclizqKSoskbjNl4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzSIo63WFtgnFl8nzh4AaABAg", "responsibility": "unclear", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzV5zJLqzYDlKo1WhJ4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "indifference"},
  {"id": "ytc_UgyL8qtTqFB-3lxmIGN4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugxk5i3OSoQ8r-P_HDN4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugz6Z_9yGpuhmEadG2p4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"}
]
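Each raw response is a JSON array of per-comment records keyed by comment id, with one value for each of the four coding dimensions. A minimal sketch of how such a response could be parsed and indexed so that the coding for any single comment can be looked up (the function and variable names here are illustrative, not part of the tool; the array is truncated to two records for brevity):

```python
import json

# Two records from the raw LLM response above (illustrative subset)
raw = '''[
  {"id": "ytc_Ugzwe76NfuYtBb1jtXl4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzGEbyz2xO0qlyZ6rx4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"}
]'''

# The four coding dimensions used throughout the panel
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw_json: str) -> dict:
    """Parse a batch coding response and index the records by comment id."""
    records = json.loads(raw_json)
    indexed = {}
    for rec in records:
        # Keep only the expected dimensions, defaulting any missing one
        # to "unclear" (the fallback value the codebook already uses)
        indexed[rec["id"]] = {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
    return indexed

codings = index_codings(raw)
print(codings["ytc_Ugzwe76NfuYtBb1jtXl4AaABAg"]["emotion"])  # outrage
```

Indexing by id makes it straightforward to join a record back to the comment it codes, as the "Coding Result" view above does for a single comment.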