Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Ik im late but atleast us artists actually put something called effort into thes…" (ytc_UgzN1yO5J…)
- "If ai can replace human labour, there will be less humans working which means yo…" (ytc_UgxXSCKRf…)
- "AI creates images the exact same way a human does. If you don't understand this,…" (ytc_Ugxys4MEQ…)
- "Maybe make your own hypothetical like the trolley problem, but although it's dif…" (ytc_Ugx0Dvla7…)
- ""aspiring AI artists" - WTAF - how you can call yourself an artist when you do n…" (ytc_UgzX-2hRR…)
- "No shit. People in the know have seen this coming for over a decade. If Bernie w…" (ytc_UgxiKR1mq…)
- "'Dragon Con currently lacks a clear public policy addressing A.I. art in its Art…" (ytc_UgyIDBzMu…)
- "@ShadowManceri yet we still work 40 hours a week , boisses keep filling hours wi…" (ytr_Ugx0afQAw…)
Comment
Rather than eliminate humans -as if global capitalism could dispense with both working class and consumers-advanced AI can only undermine the very foundation of the capitalist system of exploitation and oppression of humanity. One of the late great science fiction writers, Isaac Asimov , made it axiomatic in his Basic Laws for Robotics :robots must never harm human beings!
NOTE : From GOOGLE search :
[ " Isaac Asimov's Three Laws of Robotics are a set of rules for robots in his science fiction stories, designed to ensure their behavior is beneficial and harmless to humans. The laws are: 1. A robot may not injure a human being or, through inaction, allow a human being to come to harm; 2. A robot must obey orders given to it by human beings except where such orders would conflict with the First Law; and 3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law. " ]
Platform: youtube · Topic: AI Governance · Published: 2025-09-08T10:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxBy0BqvXo-NO5ujtx4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwafRQGp_U9oB_WRPB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwlMfIiUfJMJ3oXgPx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugz6zsUcLDLI-Iu8Fpl4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxbFB02o_4bN_meXeV4AaABAg","responsibility":"user","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugzyb6FqrpO3hfIa1Xl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzKn4rOdSAP_NomJap4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyRgt23SjhfgfNr3W94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwYSbdW5qMKOcMvvQd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgxQKAxvyfgPugyZ50x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
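The raw response is a JSON array with one coding object per comment, so recovering the coded dimensions for a single comment is a parse-and-index lookup by comment ID. A minimal sketch of that step (the function name and the two inlined sample rows are illustrative, not part of the tool itself):

```python
import json

# Raw LLM response: a JSON array of per-comment codings, as shown above.
# Only two rows are inlined here to keep the example self-contained.
raw_response = '''
[
  {"id": "ytc_UgxBy0BqvXo-NO5ujtx4AaABAg", "responsibility": "ai_itself",
   "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwlMfIiUfJMJ3oXgPx4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
'''

def index_codings(response_text: str) -> dict:
    """Parse the model output and index each coding object by its comment ID."""
    return {row["id"]: row for row in json.loads(response_text)}

codings = index_codings(raw_response)
coded = codings["ytc_UgwlMfIiUfJMJ3oXgPx4AaABAg"]
print(coded["responsibility"], coded["emotion"])  # → company outrage
```

Looking up the comment shown above by its ID yields exactly the row rendered in the Coding Result table (responsibility: company, reasoning: deontological, policy: regulate, emotion: outrage).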