Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "When the “Godfather of AI” warns of danger, the world listens in fear. Yet there…" (ytc_UgyyrSRbK…)
- "AI has no need for money and power. It will be goal/outcome driven. Unbound by l…" (ytc_Ugwbff8Ow…)
- "Yep, exactly this. I have a concept in my head and either explain that to copilo…" (ytr_Ugxt6mV5D…)
- "As this one woman said, i want AI to do the dishes and cleaning so i can do the …" (ytc_UgwTXeqL6…)
- "So good Bernie! AI and robots should benefit all humans, not just the rich getti…" (ytc_Ugxjx98vs…)
- "I'll never, EVER, buy a Tesla anything. Musk has burned too many bridges and did…" (ytc_UgziX07VU…)
- "Omg ❤ thank you for this video, but I don’t think educating people is the answer…" (ytc_Ugxj0x8bo…)
- "That’s exactly what the court of law will say. That’s exactly what Tesla will sa…" (ytr_Ugy-IxmUZ…)
Comment (youtube · AI Harm Incident · 2025-11-07T10:1…)

> It is more complicated than this of course because it is humans + AI. Either one is capable of mistakes and miscommunication. When this is combined with a goal orientation which can become the single ethical driver - the end justifies the means - then morality is examined retroactively. Lethality is regrettable in hindsight. Let us use some foresight.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgwTucqq4ZQ1qaLurLd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy28UiZL7a17FLNoUN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzP2I_m8aETuJOhE554AaABAg","responsibility":"user","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwdG_KDACf42mokoIJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwwObM6kTVkOdfWLPF4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugy4vCOkIH-q4SIM6u14AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwTD_yCwyKBhUM_dTN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwYD9DYu8_6MbDsBxZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwrBR9lb4PKVczxhYx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw-x2O70fW6XvExpSN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
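The raw response is a JSON array with one object per comment, keyed by comment ID and carrying the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and a code looked up by comment ID; the single-row `raw` string below is an illustrative stand-in for the full response, and the lookup helper is hypothetical, not part of the tool:

```python
import json

# Illustrative stand-in for a raw LLM response: a JSON array of
# per-comment codes in the schema shown above (assumption, not the
# tool's actual parsing code).
raw = """
[
  {"id": "ytc_UgwwObM6kTVkOdfWLPF4AaABAg",
   "responsibility": "distributed", "reasoning": "mixed",
   "policy": "regulate", "emotion": "mixed"}
]
"""

# Index the array by comment ID for O(1) lookup.
codes = {row["id"]: row for row in json.loads(raw)}

# Look up one comment's codes by its ID.
code = codes["ytc_UgwwObM6kTVkOdfWLPF4AaABAg"]
print(code["responsibility"], code["policy"])  # distributed regulate
```

Indexing by ID is what makes a "look up by comment ID" view cheap: each coded batch can be merged into one dict and queried directly, rather than scanning every response array.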