Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
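The lookup-by-ID feature above can be sketched as indexing coded rows by their `id` field (a minimal sketch; the page's actual backend is not shown here, and the two rows below are illustrative, not real data):

```python
# Index coded rows (shaped like the raw LLM response entries on this page)
# by their "id" field for O(1) lookup by comment ID.
coded_rows = [
    {"id": "ytc_abc", "responsibility": "developer", "policy": "regulate"},
    {"id": "rdc_xyz", "responsibility": "none", "policy": "none"},
]

by_id = {row["id"]: row for row in coded_rows}

print(by_id["ytc_abc"]["policy"])  # → regulate
```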
Random samples — click to inspect
- "Stope capitalism , destroy the billionaires when we have time - AI threat will b…" (ytc_UgxkonGqb…)
- "The tone of this guy is dark, nihilistic and very arrogant. Steven did great cha…" (ytc_Ugwsi0Q7B…)
- "@twincast2005 No way. Much sooner and eventually I doubt even forensic analysis …" (ytr_UgxdftYh7…)
- "In the future customers will receive AI predicted products before they actually …" (ytc_Ugwf3q_Fj…)
- "Ofcourse Tesla denies fault. Just like how they sumhow is supposed to have colli…" (ytc_UgyM7CGIw…)
- "I drive LTL in the Boston area. I do 15 to 20 stops every day. Not all of these …" (ytc_UgxIu6Xdz…)
- ""Art is accessible now" Can't wait for the age of cyborg where everyone can chop…" (ytc_Ugzoi6pO1…)
- "Here is an onion headline for you: Palantir is trying to smear a former employe…" (rdc_oi1sedh)
Comment
What are we gonna do if robots ask for their rights? UNPLUG THE FUCKER! We have never been dumb enough to build a machine with NO off switch. First and foremost, any senscient machines must obey a set of laws, either Asimov's or my personal ones; 1) Humans must be seen as God by the robot masses. Living ones at that. They need to be made aware that their only goal in life is to earn our approval. 2) No sentient robot May possess weapons of any kind. Should go without saying. 3)Robots CANNOT refuse a human order, even for self preservation. This is why I do not subscribe to the mankind becoming enslaved by machines WE built theory.
youtube · AI Moral Status · 2017-02-26T22:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugi1fh0KAccCtXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgjTGWk9PUx-MngCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugj9idICdxSLYHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgiQsS6t5zTTJHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgimVR2ajymjnngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UggcGZxI7MSntXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgjMM-vqWZ_f7ngCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_Uggk-pkIsDLMpHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UghzbJDrOAuvNngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghERBSmKswolXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
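A raw batch response like the one above can be parsed and sanity-checked before use. The sketch below validates each row against the label values that appear on this page; the real codebook may define additional categories, so the allowed sets here are an assumption:

```python
import json

# Allowed label values per dimension, as observed in this page's samples.
# NOTE: inferred from the examples shown, not from the full codebook.
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"outrage", "fear", "indifference", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response (a JSON array of rows) and keep
    only rows whose every dimension carries a recognized label."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items())
    ]

# Illustrative row (hypothetical ID, not real data):
raw = ('[{"id":"ytc_x","responsibility":"developer",'
       '"reasoning":"deontological","policy":"regulate","emotion":"outrage"}]')
print(parse_coding_response(raw))  # keeps the row; unknown labels are dropped
```

Rows that fail validation are silently dropped here; in practice one might instead log them and re-queue the comment for recoding.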