Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Isaac Asimov's Three Laws of Robotics: Back in 1942, before the term was even coined, the science fiction writer Isaac Asimov wrote The Three Laws of Robotics: A moral code to keep our machines in check. And the three laws of robotics are: a robot may not injure a human being, or through inaction allow a human being to come to harm. The second law, a robot must obey orders given by human beings, except where such orders would conflict with the first law. And the third, a robot must protect its own existence as long as such protection does not conflict with the first and the second law. (from a forum hosted by Britannica)
youtube AI Governance 2023-03-30T13:1…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       deontological
Policy          none
Emotion         approval
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgwmaTKbsho7xfXRlm94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxzFpS_ytoaTy-rhp14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytc_Ugxs6l_cwP1aYITLMDp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzlfmBj3Qt_EK4gE7h4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyNIZ-d280ARnkhsUp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxqN3eWIHvH0ljO4894AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx0YQSqTn7YiQPyAkB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugyys0bLM5BI-uk7RYl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxTGLv5ZCtrv8kldNN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwLY6Tm__sG6qMa16t4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"}
]
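Since the model returns a JSON array of per-comment codings, the per-comment table above can be reconstructed by parsing that array and looking up a comment id. A minimal sketch, assuming only the field names visible in the response above (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the `coding_for` helper is illustrative, not part of any real pipeline, and the raw string is truncated to two entries for brevity:

```python
import json

# Raw LLM response, truncated to two of the ten entries shown above
raw_response = '''[
  {"id":"ytc_UgwmaTKbsho7xfXRlm94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwLY6Tm__sG6qMa16t4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"}
]'''

def coding_for(comment_id, response_text):
    """Parse the JSON array and return the coding dict for one comment id."""
    for entry in json.loads(response_text):
        if entry["id"] == comment_id:
            return entry
    return None  # id not present in this batch

coding = coding_for("ytc_UgwLY6Tm__sG6qMa16t4AaABAg", raw_response)
print(coding["responsibility"], coding["emotion"])  # ai_itself approval
```

This matches the coding result shown above for the Asimov comment: responsibility `ai_itself`, reasoning `deontological`, policy `none`, emotion `approval`.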