## Raw LLM Responses

Inspect the exact model output for any coded comment, or look a comment up by its ID.

### Random samples (click to inspect)
- "So here is the question that I know the answer too? Do you honestly think AI wil…" (`ytc_Ugzxv7Em0…`)
- "Thank you for your comment! Sophia, the AI robot in the video, indeed has a char…" (`ytr_Ugy2JrqAU…`)
- "Never had a problem with AI: a robot never stopped me from drawing what I wanted…" (`ytc_UgwwPXcHw…`)
- "I thought it was the immigrants that were stealing all the jobs, another excuse …" (`ytc_Ugz2TKmkZ…`)
- "This is just a user skill issue! AI can do it if you can engineer your prompt be…" (`ytc_UgyIbcGVe…`)
- "My new wife is a robot... she beats me, I wont ask for a divorce again,I swear..…" (`ytc_UgzQf9fK_…`)
- "LLMs will never become self-aware and ambitious. Intelligence is far more comple…" (`ytc_UgwfE3mIH…`)
- "I heard you talking about life meaning, one thing I learned from Bashar (Darryl …" (`ytc_UgxKd4lEl…`)
### Comment

> Well humans really need a reason to try to ban together right now, whether it’s AI robots or extraterrestrials. So there could be an upside to this. And really what can I possibly do about this right now if I don’t believe legislators can or will do anything for the good of humanity. It will probably take something completely devastating to get people to care about each other.

youtube · AI Moral Status · 2025-06-05T20:1…
### Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
### Raw LLM Response

```json
[
  {"id":"ytc_Ugz4aQzpTxkqd5rQ_dV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw5Bq9oSRu3NuLOcS94AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgziJsgn60S5eXj9wfV4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugy-DhGuquoZXlTlOd94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwvBJMlGe_cEJcWR8h4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyjpO11z9j6nSQEEUN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"resignation"},
  {"id":"ytc_UgyUeAg9jrvEJ-VB5rp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugwy9sgUNOpl8c2DuzR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"outrage"},
  {"id":"ytc_UgxSC9h-aiHL2L1sdHl4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugwy-P0yFzyJ3aACXzB4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
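The raw response is a JSON array of per-comment codes, one object per comment. A minimal sketch of how such a batch could be parsed, validated, and indexed for the comment-ID lookup shown above — note that the allowed category values below are inferred only from the samples on this page, and the real codebook may define more:

```python
import json

# Allowed values per coding dimension, inferred from the examples above
# (assumption: the actual codebook may be larger than this).
SCHEMA = {
    "responsibility": {"none", "government", "user", "ai_itself", "distributed", "company"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "ban", "industry_self", "liability"},
    "emotion": {"approval", "outrage", "mixed", "indifference", "resignation", "fear"},
}

def validate_codes(raw: str) -> dict:
    """Parse a raw LLM response and return codes keyed by comment ID.

    Raises ValueError on a missing ID or an unknown category value.
    """
    coded = {}
    for entry in json.loads(raw):
        cid = entry.get("id")
        if not cid:
            raise ValueError(f"entry missing id: {entry!r}")
        for dim, allowed in SCHEMA.items():
            if entry.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={entry.get(dim)!r}")
        coded[cid] = {dim: entry[dim] for dim in SCHEMA}
    return coded
```

With the response validated into a dict, the "look up by comment ID" operation is a single key access, e.g. `validate_codes(raw)["ytc_UgyjpO11z9j6nSQEEUN4AaABAg"]["emotion"]` would return `"resignation"` for the batch above.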