Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- `ytc_UgyvjVLkS…`: "I can foresee the crazyest s*it coming ... and I'm not convinced mankind is matu…"
- `ytc_UgzLoYoT6…`: "I had a mental health breakdown and attempted to delete my save file in life bec…"
- `ytc_UgwSrnYeP…`: "As a disabled person what they are saying is more ableist than someone saying yo…"
- `ytc_UgzUPndEW…`: "I would say, it's not gonna happen nationwide. There are so many issues with dri…"
- `rdc_n7p20yh`: "It’s not AI that will replace your job, it’s other people who know how to use AI…"
- `ytc_UgyxGSHNP…`: "AI can never replace a good teacher, but it can do a lot better than most teache…"
- `ytc_UgwuoMnKp…`: "I think the car replacing the horse and horses being destroyed is the wrong anal…"
- `ytc_Ugx6v7D0h…`: "I agree with that. It's going to be a few years. Not too long. I see the advance…"
Comment
Isaac Asimov figured this out decades ago. The Three Laws, presented to be from the fictional "Handbook of Robotics, 56th Edition, 2058 A.D.", are:[1]
1) A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2) A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
3) A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
youtube · AI Moral Status · 2025-06-06T12:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxKzLbuH6_oAiou6DJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwczAd0fv3pflJHFjp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugwtsp01TS67ik1FqE94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzJmnixUNSIKUgtYtJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxRSAvCNoLcYjDbyed4AaABAg","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwHWcb_yb453gQtr_h4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugzjmc6XBfpVIhfmCqp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyeCGfosjkJTgonXLN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwXVWgdQDErzZZXuK94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxrBd_RBW09ruPc02l4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
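Because the raw response is a plain JSON array of coded rows, looking a comment up by its ID reduces to parsing the array and indexing it. A minimal sketch of that step, using two rows copied from the response above (the helper name `index_by_id` is illustrative, not part of any actual pipeline code):

```python
import json

# Two rows taken verbatim from the raw LLM response shown above.
raw_response = """[
{"id":"ytc_UgwczAd0fv3pflJHFjp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxRSAvCNoLcYjDbyed4AaABAg","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"outrage"}
]"""

def index_by_id(raw: str) -> dict:
    """Parse the model's JSON array and key each coded row by its comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

coded = index_by_id(raw_response)
# The first row matches the Coding Result table above.
print(coded["ytc_UgwczAd0fv3pflJHFjp4AaABAg"]["policy"])  # → regulate
```

In practice the model output would first need validating (well-formed JSON, expected keys, IDs matching the submitted batch) before being stored; `json.loads` raises `json.JSONDecodeError` on malformed output, which is a natural point to flag a failed coding run.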