Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "The hype that AI can “replace human intelligence” in almost everything is fundam…" (ytc_UgxfWvPG-…)
- "How can AI optimists exist given that millions of people have already lost their…" (ytc_Ugz0ZrBac…)
- "It is not wrong for AI to learn the way human artists do. Ai is a thing, it is n…" (ytc_UgxwuveVd…)
- "At this point, the only regulation that's feasibly possible now (and the only st…" (ytc_Ugw98SrcH…)
- "You think this is funny you just wait AI Will be having the last laugh…" (ytc_UgwA83E-c…)
- "Hi, I am hopeful it will be our ally however it needs to be in the hands of comp…" (ytc_UgwbEWKhP…)
- "im a bit baffled by how people approach this. superintelligent ai will in about …" (ytc_UgxpuvyLd…)
- "So are we saying that we're not allowed to create AI art because the AI used cop…" (ytc_UgxHB9jo1…)
Comment

1. A robot may not harm a human being.
2. A robot may not harm humanity, or, by inaction, allow humanity to come to harm.
3. A robot must know it is a robot.

youtube | AI Moral Status | 2017-02-23T15:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | liability |
| Emotion | unclear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response

```json
[{"id":"ytc_UggGnfgJ2dwXGHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UggCaMkzDkPu4ngCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgjduQeoeLF6YHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgjMFF-zoS05A3gCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"unclear"},
{"id":"ytc_UghKeWexK3ypY3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ughy_952_NNC1XgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugid6Flncn96MHgCoAEC","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgijakQOO8NP73gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiZnoQWHWW-JXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UghHBbOlXt0GlXgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"}]
```
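A raw response like the one above can be parsed into a per-comment lookup, mirroring the "Look up by comment ID" feature. This is a minimal sketch: the allowed value sets below are inferred only from the codes visible on this page (e.g. `ai_itself`, `deontological`), not from the full codebook, and `parse_response` is a hypothetical helper name.

```python
import json

# Allowed codes per dimension, inferred from the values visible on this
# page; the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "distributed", "ai_itself", "user"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"none", "unclear", "regulate", "liability"},
    "emotion": {"indifference", "mixed", "approval", "unclear", "resignation"},
}

def parse_response(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments) into a
    dict keyed by comment ID, validating each coding dimension."""
    coded = {}
    for row in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: bad {dim}={row.get(dim)!r}")
        coded[row["id"]] = {dim: row[dim] for dim in ALLOWED}
    return coded

raw = ('[{"id":"ytc_UgjMFF-zoS05A3gCoAEC","responsibility":"ai_itself",'
       '"reasoning":"deontological","policy":"liability","emotion":"unclear"}]')
coded = parse_response(raw)
print(coded["ytc_UgjMFF-zoS05A3gCoAEC"]["policy"])  # liability
```

Validating against a fixed vocabulary catches malformed or off-schema model output early, before it reaches the coding-result tables.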