Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Lmao cope harder in all seriousness y'all are worried about nothing. AI can'…" (ytc_UgzdQ0zeb…)
- "Ai artist struggling: no not that make it more bright Artists struggling: 🙇”wh…" (ytc_Ugzgjnglw…)
- "That thing about training AI takes the power of a small city, but running a huma…" (ytc_UgwLNMQxS…)
- "As an animator and someone who is studying animation, I can see the benefits of …" (ytc_Ugx13XB9j…)
- "Then why when I write a paper from my sweat and balls does it still detect ai…" (ytc_UgyznQWg3…)
- "Go watch some more AI documentaries, because this shits about to get real, real…" (ytc_UgygnmM1Q…)
- "@elephantgrass631 I create AI models for fun in my free time. When you actuall…" (ytr_UgwIE769G…)
- "Construction of AI datacenters at those locations is always going to come with a…" (ytc_UgzCDu5FQ…)
Comment

> If a true learning AI with the power to rewrite it's own source code is developed, or an AI is modeled after human consciousness to the point that hardcoded/wired limits don't work (such as the 2045 initiative actually working), at that point we will need to address this. Otherwise, the only way pain will enter the robot equation is if the builders are sadists.

youtube · AI Moral Status · 2017-04-19T20:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
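Each coded record assigns one value per dimension. As a minimal sketch of checking a record against the coding scheme, assuming the category sets are exactly those that appear in the responses on this page (the real codebook may define more), one could write:

```python
# Allowed values per coding dimension, inferred from the records shown
# on this page; treat these sets as an assumption, not the full codebook.
ALLOWED = {
    "responsibility": {"developer", "distributed", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "unclear"},
    "policy": {"regulate", "ban", "liability", "none"},
    "emotion": {"indifference", "approval", "fear", "outrage"},
}

def validate(record: dict) -> list:
    """Return a list of problems found in one coded record; empty means valid."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems
```

A record coded outside these sets (or missing a dimension) is flagged rather than silently stored, which is useful when the model occasionally emits a label the scheme does not define.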
Raw LLM Response
```json
[
  {"id":"ytc_UghZxim60h8djXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgjWFrifyXFxLngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugg6_68H1uxBc3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggpPoRogRJoJngCoAEC","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgiDTskh2rn2yHgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgjGaC_PeYYcrXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugg16N0dkIPH9XgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgizKQfBDOEQFXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugg9hqGfomYCEngCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgglqXCxOme6MXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
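To look up a coded comment by ID, the raw response above can be parsed and keyed by the `id` field. A minimal sketch, using two records copied from the response and assuming the model returned a bare JSON array (in practice the output may need to be stripped of surrounding prose or code fences first):

```python
import json

# Two records copied verbatim from the raw LLM response shown above.
raw_response = """
[
  {"id":"ytc_UghZxim60h8djXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgjWFrifyXFxLngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse the model's JSON array and key each coded record by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

coded = index_by_id(raw_response)
print(coded["ytc_UghZxim60h8djXgCoAEC"]["reasoning"])  # consequentialist
```

Keying by ID makes the "look up by comment ID" view a single dictionary access rather than a scan over the batch.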