Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- ytc_Ugy5tz2SL…: "Art isn’t a thing your born with, it stems from passion, and that passion turns …"
- ytr_UgzHsoZzs…: "Self-driving literally kills traffic in the majority of places. It'll improve tr…"
- ytc_Ugx3xrdSn…: "This is just wrong, I personally don't care whether people disagree but this is …"
- ytc_UgyuU3FwR…: "I knew the whole ai campaign stuff, but I didn’t know how bad it would get.…"
- ytr_UgyHsmTMA…: "Same here! 37 years. I’m so happy for advanced technology like AI and other LLMs…"
- ytr_UgzF4EYAm…: "@davesanders95 it will stay that way, LLMs are fundamentally incapable of turnin…"
- ytc_UgyRiQThk…: "For anyone wondering if it could actually be sentient, I encourage you to look i…"
- ytc_Ugzsi0g0U…: "2:29 you actually can. It would just take a *lot* more effort than it does for a…"
Comment
This is a great achievement for Hanson Robotics. Given sufficient AI, I think we eventually need to phase-in rights for autonomous synthetics in line with human rights. This is as much for our psychological welfare as it is for our physical protection. All within the context of Asimov's Laws of course:
"A robot may not injure a human being or, through inaction, allow a human being to come to harm. A robot must obey orders given it by human beings except where such orders would conflict with the First Law. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law."
Source: youtube · AI Moral Status · 2017-09-15T13:5… · ♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw6JqoNPkN8jRLx1iN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzIdCvbOQNUpkKnaI94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyH4-sUIWdNxXM4jHp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxpCks06QcTXkMIIVN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxO4uyNzWak5cc4pn54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzP314xBuhNSWXDEkd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_Ugwo_Im_D5NS__rG8QV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzST60e08EVydp6ScN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgygsKDg8vnFWGoddu54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxvr9sin1i4gWbPpPp4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"approval"}
]
```
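The model returns one JSON array per coding batch, with one object per comment ID. A minimal sketch of how such a response could be parsed and indexed for lookup by comment ID (variable names are hypothetical; the array here is truncated to two of the entries shown above):

```python
import json

# Raw LLM response for one coding batch (truncated to two entries
# from the example above; the real response holds the full batch).
raw_response = """
[
  {"id": "ytc_Ugxvr9sin1i4gWbPpPp4AaABAg", "responsibility": "distributed",
   "reasoning": "deontological", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugw6JqoNPkN8jRLx1iN4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "none", "emotion": "fear"}
]
"""

# Index the coded rows by comment ID so any comment's coding result
# can be retrieved directly, as in the "Coding Result" table above.
coded = {row["id"]: row for row in json.loads(raw_response)}

result = coded["ytc_Ugxvr9sin1i4gWbPpPp4AaABAg"]
print(result["emotion"])  # approval
```

The dictionary comprehension gives constant-time lookup per ID, which is what an "inspect the exact model output for any coded comment" view needs when batches are large.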