Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
He's a fan of sci-fi but doesn't mention the book, I, Robot by Isaac Asimov, where there are the Three Laws: A robot may not injure a human being or, through inaction, allow a human being to come to harm. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
youtube · 2026-02-18T05:4… · ♥ 3
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       deontological
Policy          regulate
Emotion         indifference
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[ {"id":"ytc_Ugy_ktNRq9GhPaQzxZ94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytc_UgwhSv9TXepfYzfQAfR4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"}, {"id":"ytc_Ugys7Q9NazuUV7PwhYB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"indifference"}, {"id":"ytc_UgzqTlB41VXFeorCsMB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"}, {"id":"ytc_UgydBHOjvEO5Aeo72ch4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}, {"id":"ytc_UgxIjc0JMrHvxzcvlJF4AaABAg","responsibility":"company","reasoning":"virtue","policy":"industry_self","emotion":"approval"}, {"id":"ytc_Ugyu89iTdEuGxgXyd7h4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_UgwavegenCr_-Jazi5V4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"}, {"id":"ytc_UgzE9ScnVhqerTsdVqx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}, {"id":"ytc_UgwoNFNeinU9xv6JW5p4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"} ]