Raw LLM Responses
Inspect the exact model output for any coded comment.
Coded comments can be looked up by comment ID; a lookup sketch appears after the raw response at the end of this section. Random samples:
- Artists werent BORN with talent. They work hard to get up to a level of artistic… (ytc_Ugx53JQyD…)
- “This guy” Oh fuck, we’re talking about Shadiversity again… WARNING, SHAD HAS B… (ytc_Ugz_hE-9V…)
- I really thoughti was the only one who said please and thank you to ai's like ch… (ytc_UgzQXaI3U…)
- AI appears to be an enormous pattern recognition system, which then regurgitates… (ytc_UgzzfqyrY…)
- I'm not a scholar but AI is really prone to making mistakes, like to the point w… (ytr_UgzBFhE1P…)
- @pezvonpez Presented without arguement. Because it's a machine? Because it's not… (ytr_Ugz7m4gvz…)
- While I respect your perspective and passion, I think the approach you're taking… (ytc_UgzmcirBT…)
- This whole report is probably made my AI. And has like 0 information. Guys don't… (ytc_Ugyo0xyG-…)
Comment
I think if you give rights to conscious robots it may increase the chance for robotic takeover. Not a conspiracy theory. If they refuse to do stuff because they consider it dangerous to themselves (seen a video about a tiny robot refusing to roll down from a table because it considered "falling off" to be dangerous to itself), they may refuse to stop this and that. Or they may refuse to do this and that. And if you give them rights they will refer to them, like "Hey, I have the rights to do this", you would't be able to do anything with them, and they might consider YOU as a threat to their presence and they may refuse to follow the laws of robotics, because they consider you harmful.
Just a sidenote...
Author: ^Hex · Source: youtube · Video: AI Moral Status · Posted: 2017-03-24T20:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
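
The coding scheme has four dimensions. As a minimal sketch, the value sets observed in this section (they may not be the full codebook) can be captured in a small validator; the helper name is hypothetical:

```python
# Value sets observed in this section's coding result and raw response;
# assumed from the examples shown, not necessarily exhaustive.
CODING_SCHEMA = {
    "responsibility": {"none", "ai_itself", "developer"},
    "reasoning": {"unclear", "deontological", "consequentialist", "mixed"},
    "policy": {"none", "regulate", "liability", "ban"},
    "emotion": {"indifference", "outrage", "fear", "approval"},
}

def validate_coding(record: dict) -> list[str]:
    """Return a list of problems found in one coded record."""
    problems = []
    if not record.get("id"):
        problems.append("missing comment id")
    for dimension, allowed in CODING_SCHEMA.items():
        value = record.get(dimension)
        if value not in allowed:
            problems.append(f"{dimension}: unexpected value {value!r}")
    return problems
```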
Raw LLM Response
[{"id":"ytc_UgxOz32Mqir4mx9Q7ep4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxHnrQw-5aECr2Zvt54AaABAg","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwr60lU1uhM2pDe5bp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxiDEONyjhXpZiAX214AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzSJnuzoFnHlr7OVHB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UghQiW0rduVSOXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugh6hVu_9ssjf3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugh91X5m-k7M6XgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UghfBFxixrIHDXgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgjPKZV0GM_N-HgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}]