Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Idea: since humans basically have developed their own ideas about human rights, if we DO create conscious robots, what if we just give them the task of coming up with their own set of robot rights? It seems unfair that we would decide for them so why not just let them decide what rights they deserve and then respect their requests? And the moral dilemma of "do robots deserve rights?" would be solved, because if conscious robots ever exist, they will answer the question for us, and if they never exist, then the answer will just always be "we'll ask the robots when we get there"
| Field | Value |
|---|---|
| Source | youtube |
| Video | AI Moral Status |
| Posted | 2017-09-07T00:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzDHaJFwj5pcviHQjV4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgxaMVNny9qeZzHlTwp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzTJr3S7bSjFMfmaKd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgyDthKTrxCQPyrjept4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugwh-e_7snxnjwPa2IF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugw8YpMYlyaWojY_vQV4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"unclear"},
  {"id":"ytc_Ugzgm5y7Z-SSXSwa54R4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxsV8WyjJJzRlugbS94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxcHqDQB2tH_7BP-5Z4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwyPFEzrOq0MX12_Zh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"unclear"}
]
```
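Since the model returns one JSON array per batch, recovering the coding for a single comment means parsing the array and indexing it by `id`. A minimal sketch of that lookup, using two entries excerpted from the response above (the dimension names `responsibility`, `reasoning`, `policy`, and `emotion` are taken directly from the JSON; everything else is illustrative):

```python
import json

# Excerpt of the raw model output shown above (two rows, pretty-printed).
raw_response = """
[
  {"id": "ytc_Ugw8YpMYlyaWojY_vQV4AaABAg", "responsibility": "distributed",
   "reasoning": "contractualist", "policy": "regulate", "emotion": "unclear"},
  {"id": "ytc_UgwyPFEzrOq0MX12_Zh4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "unclear"}
]
"""

# Index the coded rows by comment ID so any comment can be looked up directly.
coded = {row["id"]: row for row in json.loads(raw_response)}

# The entry for the comment inspected above matches its Coding Result table.
row = coded["ytc_Ugw8YpMYlyaWojY_vQV4AaABAg"]
print(row["responsibility"], row["policy"])  # distributed regulate
```

In a real pipeline the raw string would come from the model API rather than a literal, and malformed JSON should be caught (e.g. `json.JSONDecodeError`) before indexing.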