Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Roger Penrose is right. AI is not intelligent. But not because it doesn't have c…
ytc_UgwTJwsMQ…
Imagine how hard it is for the person doing the facial recognition, they all loo…
ytc_UgxbF1AOv…
SO many warnings, from smart people, who have a lot of knowledge about AI, all o…
ytc_UgxqCZOGv…
Why we are running after AI AI AI if it is danger for the Human race??????…
ytc_UgwdbifWy…
I will give u one strong reason why this will not happen Max population of wor…
ytc_Ugy8ic-uS…
The world is ruled by corrupt humans so did we think that AI would not be corrup…
ytc_UgwSstX0a…
i remember exact discussions only a few years ago abt how Ai might take our phyi…
ytc_UgynVuYU_…
So your grammar is all over the place, but from what I can gather you like ai ar…
ytr_Ugzdsz4vd…
Comment
We're modeling AI after humans and our behavior, humans are flawed, it stands to reason that what we create is equally flawed in it's own ways, so what would happen if we were to try and replicate our flaws into tools we use in flawed ways for flawed reasons.
youtube
AI Harm Incident
2025-09-11T07:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | virtue |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgxJV9hbZhJadOQNl-V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzlMcTajKdR2YtXJfR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgydD-EbHQA-dR-u1Pt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugw2YAif-E2WymI75_l4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxquFDrS3vRFaHoN7V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzQm-PY8NijX8owfbp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxy9DxGDei1SaiLAIh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyULKKf9nANMqXJwnh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgzdrZhT9uOTz9jEfjJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx5N_8asQTDsLZYhNB4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"}
]
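The raw response above is a JSON array in which each row codes one comment along four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be parsed and validated downstream, assuming the allowed values are those visible in the responses shown here (the real codebook may define additional categories):

```python
import json

# Allowed values per dimension, inferred only from the rows visible above;
# this is an assumption, not the full coding scheme.
SCHEMA = {
    "responsibility": {"ai_itself", "developer", "distributed", "none"},
    "reasoning": {"consequentialist", "virtue", "unclear"},
    "policy": {"none", "ban", "regulate", "industry_self"},
    "emotion": {"outrage", "fear", "resignation", "indifference"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only rows whose values
    all fall inside the known schema."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items())
    ]

# Hypothetical one-row batch for illustration (the id is made up).
raw = '[{"id":"ytc_example","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"fear"}]'
print(len(validate_batch(raw)))  # 1 row passes validation
```

Rows with out-of-schema values are dropped rather than corrected, so a malformed LLM response degrades to fewer coded comments instead of silently storing invalid labels.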