Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- ytc_Ugy_DX5Uq…: As a programmer who is a bit sceptical of AI while trying to see it's uses this …
- ytc_Ugyiek-1I…: I think he said he spent 100 hours or something combing dozens of AI images and …
- ytc_Ugx7o7h2u…: I don't even know one IITian who developed any Large language Model that is base…
- ytc_UgwkbfcsE…: I find that AI can't get the "eyes" right. But I'll also start looking at the s…
- ytc_Ugzfy1amW…: I have an idea. What if we have specific roads for self driving cars. And those …
- ytc_UgwGe4hPZ…: “How dare you write that essay on a computer instead of writing it on paper. I j…
- ytc_Ugz6c_DnL…: Wrong. Both is AI. The first is clearly AI if you look at the fingers on the ben…
- ytc_Ugwhx14H_…: It's not "artists" being fed up with AI - it's daily creators doing industry sta…
Comment

> Before AI we were following a trajectory of climate catastrophe 100% killing us. There's no reality where the 1% decides to save the world they're destroying for profit, so that was our future. Now there's a variable because AI might stop us. If you look at all the possibilities, maybe AI will stop us in a way that allows us to live, something our human masters aren't considering.

youtube · AI Moral Status · 2026-01-08T15:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyRS7nA8Nsd4FaB5zR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugy9yAng0HbV9LnGIkJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwhNsfR7j9Y_pFpuz14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyY6FveWsmVNNTN07R4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyloLHXwib62qk7KP54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwJgHnYKjoKu1MLCkd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyVELHfhjam4-qI0Kd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxRUOHv5d88CbCY0XB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyXHdEveQrz-kC-FoJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz6ezpkDQqui6FFagh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"}
]
```
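The coding result shown in the table above is recovered by matching the comment's ID against the `id` field in this raw JSON array. A minimal sketch of that lookup, assuming the model output is valid JSON; the `lookup` helper is illustrative, and only two of the ten rows are copied here:

```python
import json

# Two rows copied verbatim from the raw LLM response above (the full array has ten).
raw_response = """[
  {"id":"ytc_UgyXHdEveQrz-kC-FoJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz6ezpkDQqui6FFagh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"}
]"""

def lookup(codes_json: str, comment_id: str) -> dict:
    """Index the coded rows by comment ID and return one comment's dimensions."""
    rows = {row["id"]: row for row in json.loads(codes_json)}
    return rows[comment_id]

# The row for the displayed comment matches its Coding Result table.
codes = lookup(raw_response, "ytc_UgyXHdEveQrz-kC-FoJ4AaABAg")
print(codes["responsibility"], codes["emotion"])  # distributed mixed
```

A real pipeline would also need to handle model outputs that fail `json.loads` or omit an ID, which is presumably why the dashboard exposes the raw response for inspection.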