Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "Leet me talk with that AI, and I will probe it is just programmed to behave like…" (ytc_Ugx34A30p…)
- "AI is dangerous because the wealthy and powerful will exploit it for their own g…" (ytc_Ugwh-NUil…)
- "It’s like the horseshoe makers striking because the car is taking over. What the…" (ytc_Ugzjyk2Da…)
- "But like you said if a toaster can't move does it feel bad about being put in co…" (ytc_UghVIe6nF…)
- "In my opinion AI can be helpful sometimes, saving a lot of time but we also need…" (ytr_Ugx-NYDHu…)
- "I'm thinking the self-driving car would automatically slow down to safe speed so…" (ytc_UghZ3KaDp…)
- "@roxsy470 1. The problem with you claim is that it does not fit reality. It was …" (ytr_Ugzw87JYO…)
- "So sorry this happened Sam and it's getting ridiculous. Artists need legal prote…" (ytc_Ugx8AUrMS…)
Comment
Who taught AI to lie? No person. Saying that AI and AGI is under our total control is dangerously misleading. It isn't and as AGI develops and grows it will be less and less so. The whole system will be a black box. Even the greatest human geniuses will be equivalent to monkeys.
youtube · AI Governance · 2025-09-11T09:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | virtue |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
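Each dimension in the coding result takes one value from a closed set. A minimal validation sketch, assuming only the value sets observed in the output on this page (the full codebook may include more categories):

```python
# Allowed values per dimension, inferred from the coded records shown
# on this page; the actual codebook may define additional categories.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "government", "media",
                       "distributed", "none", "unclear"},
    "reasoning": {"virtue", "deontological", "consequentialist",
                  "contractualist", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "approval", "indifference", "resignation",
                "mixed", "unclear"},
}

def validate(record: dict) -> list:
    """Return (dimension, value) pairs that fall outside the allowed sets."""
    return [(dim, record.get(dim)) for dim in ALLOWED
            if record.get(dim) not in ALLOWED[dim]]

# The coded comment above passes validation:
record = {"responsibility": "ai_itself", "reasoning": "virtue",
          "policy": "regulate", "emotion": "fear"}
print(validate(record))  # prints: []
```

An empty list means every dimension matches the table above; any off-schema value (e.g. a model hallucinating a new category) is surfaced as a pair for review.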
Raw LLM Response
[
{"id":"ytc_UgwGgd76ahi6_nySzo54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxmiLvA5uK1YuayNYR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwzRsr5-hRVZgoqKp94AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugyjr3ch4bylfLrbaBV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzuApD9QLLDQNux0JJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw3rWMrf5WHQFvEscF4AaABAg","responsibility":"unclear","reasoning":"contractualist","policy":"unclear","emotion":"unclear"},
{"id":"ytc_Ugwxwuo8GXVxhVeJvlh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugwdtl2WGx841wAxZ2t4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgwFIHfV3vTNMLQwci54AaABAg","responsibility":"media","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwjEsKD_GFa716Dngl4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"regulate","emotion":"fear"}
]
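The raw response is a JSON array of per-comment records, so inspecting the model output for any coded comment reduces to parsing it and indexing by ID. A minimal sketch in Python, using an illustrative two-record subset of the array above:

```python
import json

# Raw LLM response: a JSON array of coding records in the four-dimension
# schema shown above (illustrative subset of this page's output).
raw_response = '''
[
  {"id": "ytc_UgwjEsKD_GFa716Dngl4AaABAg", "responsibility": "ai_itself",
   "reasoning": "virtue", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwzRsr5-hRVZgoqKp94AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "regulate", "emotion": "fear"}
]
'''

def index_by_id(response_text: str) -> dict:
    """Parse the model output and index each coding record by comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codes = index_by_id(raw_response)
record = codes["ytc_UgwjEsKD_GFa716Dngl4AaABAg"]
print(record["responsibility"], record["emotion"])  # prints: ai_itself fear
```

Building the dict once makes every subsequent ID lookup O(1), which is why the inspector can jump straight from a comment ID to its exact coded record.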