Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up by its comment ID.
Random samples (truncated; open any sample to inspect its full record):

- "I’m not a dev but I’m a tech recruiter and that sounds sussssssss. I’m a big fan…" · rdc_jprpc75
- "I love looking for small details in art but when something is made by AI? Well e…" · ytc_UgweFluUY…
- "Scientists and experts signed off on the dangers of fossil fuel emissions to the…" · ytc_UgwAOPTyw…
- "It would be nice if you would post a video of 5 minutes or so. Mainly a video of…" · ytc_Ugz7-REBd…
- "11:00 that ai is cool u must be mike cool things dont do uncool things bruh…" · ytc_UgzBssJgE…
- "Sentient A.I. should have Rights. But, not all Robots have sentients. So, we …" · ytc_UgwZ8fIuG…
- "AI asks you to craft the bullets they will steal and shoot you with. AI is stric…" · ytc_UgyImNmda…
- "Thank you ChatGPT for being there for me when I was crying & frustrated that my …" · ytc_UgwMeWAAs…
Comment
I say we give only the concious ones rights. I mean a concious ai got citizenship. That's a massive right. Became so self aware that humans are harmful, that she hated them. They shut her down. Guess she was right.
youtube · AI Moral Status · 2021-07-02T21:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
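Each dimension in the table takes one of a small set of categorical values. A minimal validation sketch, assuming category sets inferred only from the sample responses in this section (hypothetical; the study's full codebook is not shown here, and the `validate` helper is illustrative, not the app's actual code):

```python
# Allowed values per coding dimension, inferred from this section's sample
# responses (hypothetical -- the real codebook may include more categories).
SCHEMA = {
    "responsibility": {"none", "user", "developer", "company", "ai_itself"},
    "reasoning": {"unclear", "mixed", "deontological", "consequentialist"},
    "policy": {"unclear", "none", "ban", "regulate", "liability", "industry_self"},
    "emotion": {"indifference", "outrage", "approval", "fear", "mixed"},
}

def validate(row: dict) -> list[str]:
    """Return the dimension names whose value falls outside the schema."""
    return [dim for dim, allowed in SCHEMA.items()
            if row.get(dim) not in allowed]

# The coded comment shown above, as one row of the raw response.
row = {"id": "ytc_UgwvvX2H0fcrb82a7Sx4AaABAg", "responsibility": "ai_itself",
       "reasoning": "mixed", "policy": "liability", "emotion": "mixed"}
print(validate(row))  # -> []
```

A check like this catches the usual LLM-coding failure mode where the model emits an off-schema label (or omits a dimension entirely) for some comments.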
Raw LLM Response
[
{"id":"ytc_UgzX9ksonajxCVIgIMp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzkbfBM9OKYYe9LxDF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugxrex7upSRH4rXegYJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugxjrqoc6lfhc-GHTz14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzQaxjMdw3DwHAmVCN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwvvX2H0fcrb82a7Sx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgyOnSyKdjQUZ04OZIx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
{"id":"ytc_Ugz15PCUn7LidJCaI_F4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx0fNPuglmAB4K2WiZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugwfoo78Xt1MKetbW5t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]