Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Also, I find extremely dumb to give feelings to robots, or the ability to feel pain. It's a human defect, a living organism's defect. If you could replace your burnt foot for a new one, there wouldn't be a need for pain.
A robot doesn't need it, of course, it just needs a system to alert it when harm is near, so it will avoid it or at least save its hard drive or remove the damaged part.
Robots need rights (or they will in the far future) but not the same rights as humans. Our right is not ours because we earn them somehow, but because we take care of our own species. Humans aren't special but we are important to other humans.
Robots will need the right to exist, and to be repaired instead of throw it away, or donated once the original owner doesn't need it anymore.
Platform: youtube | Video topic: AI Moral Status | Posted: 2017-02-23T14:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | industry_self |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytr_UggcVwGpN4yVdngCoAEC.8PKOFm5yM3m8PKPgzk1BPe","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UggcVwGpN4yVdngCoAEC.8PKOFm5yM3m8PKPm_UzYkB","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytr_UgjOmgeCBLOgcXgCoAEC.8PKOBNs0ocA8PKQRlUGktX","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UggkUCGqJef2Z3gCoAEC.8PKO6S9Sv6L8PKQ49D6PR9","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytr_Ugjvx2Y-evu9F3gCoAEC.8PKO1ozsZsx8PKRhHqYz0E","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytr_Ugh8FxkHzWzJP3gCoAEC.8PKNUTDaY9o8PKNwR98YZS","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytr_UgjIHPBY7yH_XXgCoAEC.8PKNTfq5Gr78PKPWflncTJ","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_Ugh6w3fRvaCuYngCoAEC.8PKMLLcHo098PKNRxF1aYy","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
{"id":"ytr_Ugjk_jq3mcxXRngCoAEC.8PKMB6n3zor8PKNyZNxAkB","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytr_UgjpVgvcSYi_hHgCoAEC.8PKKTJFWyhd8PKOsHVq-4H","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"}
]
```
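As a sketch of how a response like the one above can be consumed: the model returns a JSON array in which each element codes one comment on the four dimensions (responsibility, reasoning, policy, emotion), keyed by comment ID. Assuming the output parses as valid JSON, indexing the batch by `id` makes per-comment lookup direct. The IDs below are hypothetical placeholders, not IDs from the batch shown.

```python
import json

# A raw model response for one batch (two illustrative rows with
# placeholder IDs); each element codes one comment on four dimensions.
raw = """
[
  {"id": "ytr_example1", "responsibility": "developer",
   "reasoning": "deontological", "policy": "industry_self",
   "emotion": "indifference"},
  {"id": "ytr_example2", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none",
   "emotion": "approval"}
]
"""

# Index the batch by comment ID so a single comment's coding can be
# retrieved directly, mirroring the look-up-by-ID view above.
codes = {row["id"]: row for row in json.loads(raw)}

print(codes["ytr_example1"]["policy"])   # -> industry_self
print(codes["ytr_example2"]["emotion"])  # -> approval
```

In practice a parse step like this is also where malformed model output (truncated arrays, stray prose around the JSON) would surface, so wrapping `json.loads` in error handling before indexing is a reasonable precaution.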