Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "Ai is artificial intelligence.(intelligence,no dum.),and so hopefully this wil…" (ytc_UgxaTJLTA…)
- "People bowed to the kohvud god these past few years, and were ready to obliterat…" (ytc_Ugwzo16om…)
- "It thinks like a sociopath that doesn't feel normal human empathy / If that doesn'…" (ytc_Ugx5NNepV…)
- "Now We must think seriously about AI Companies that use AI must usage taxes be…" (ytc_Ugxdu-vSN…)
- "Is it weird that I'm sure I saw what could only be the original image that the A…" (ytc_Ugx-AU4lD…)
- "Humans are trying to be God instead of doing what God created us to be human..no…" (ytc_Ugx8pzvzp…)
- "I don't understand, literally no one wants AI, everyone is losing their jobs, no…" (ytc_UgxwDNdtF…)
- "People are trying to fight back the Ai image generators so pls don’t ever give u…" (ytr_UgwBrZIaw…)
Comment

> 14:22 'should we introduce legal rights for digital people, the rights to be paid, to vote..?' reminds me of the 2015-18 BBC Show _Humans_ . tbh, the moment we actually have sentient or self-aware AI then we are already cooked. this video wants to avoid the worst by making us aware of the dangers but given how incredible stupid most humans are our chances of surviving this 'experiment' is next to zero. just look at the more of 77m adult voters in the US who elected... I rest my case, m'lord.

youtube · AI Moral Status · 2025-04-26T19:4… · ♥ 25
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
{"id":"ytc_UgzJPsbZUgnZTCsGjsZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzOe4ZURwiEyf4MpL94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzfcJzuijugyHuC3Bh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgxyppPb4dtr5SRP-854AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzRC2yQxV1y5ISEWmJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzbXjsRkbBLgps3MtN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgzqvP_89QFiSZeh0NN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxgXhiH1lazqWDAxjl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy7cpo-6OMBJG0Nyo14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzE5LfWGRo6l0wBBgR4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
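The raw response is a JSON array with one record per comment, each carrying the four coding dimensions shown in the table above. As a minimal sketch of how such a response could be parsed and indexed by comment ID (the function name and the key-validation rule here are illustrative, not part of the tool):

```python
import json

# One record taken verbatim from the raw response above.
RAW = """[
  {"id": "ytc_UgzOe4ZURwiEyf4MpL94AaABAg",
   "responsibility": "distributed",
   "reasoning": "consequentialist",
   "policy": "regulate",
   "emotion": "fear"}
]"""

# The four coding dimensions, plus the comment ID.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(raw: str) -> dict:
    """Parse a raw LLM response and index records by comment ID,
    skipping any record that is missing an expected key."""
    records = json.loads(raw)
    return {r["id"]: r for r in records if EXPECTED_KEYS <= r.keys()}

coded = index_by_id(RAW)
print(coded["ytc_UgzOe4ZURwiEyf4MpL94AaABAg"]["policy"])  # regulate
```

Indexing by ID makes the "look up by comment ID" view a single dictionary access, and the key check quietly drops malformed records rather than failing the whole batch.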