Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "Only idiots who assumes Transformative fanarts that made for absolutely anything…" (ytc_Ugy1M1TmT…)
- "It seems like you're referencing a common AI trope! Sophia's perspective on wisd…" (ytr_Ugx3ehtIG…)
- "Am I the only person who's tired of hearing delusional boomers warn us of the da…" (ytc_UgxpSmRKB…)
- "Only one question to AI artists: 'Can you repeat this with other stuff?' If the …" (ytc_Ugwt2FHj5…)
- "fuel crisis is man made u now why? AI ECNOMY <3 Chinese EV cars are invading can…" (ytc_Ugw1H-b9q…)
- "I am of the opinion that AI prompted videos, text or in general anything that se…" (rdc_o5r7ka7)
- "On behalf of the irreplaceable trade jobs. Welcome, come get dirty with us, and …" (ytc_UgxBvYJ4Q…)
- "I don't understand If all jobs are done by AI. Who is going to consume products.…" (ytc_UgyzzHc3e…)
Comment
It's a double problem, since humans have a cognitive bias to filter for things that confirm what they want to believe, and LLMs have an inherent approval-seeking tendency to output to users what they want to hear. Human users also overestimate how intelligent those hyped-up algorithms actually are: as we see, they don't really have a sense of self or continuity or even that they're giving inconsistent information. People shouldn't be relying on them for any crucial or cognitively demanding activities.
youtube · AI Harm Incident · 2025-11-25T16:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgwiBUF0TkF7ynX_3bR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyGXcH9mby8-4hYqwl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyU-RSLLQpl-nEJiAp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzUzg1e1D9UDCmiE9B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgykUh1RLKYbRB0lmw54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwgWM_M2XaTwdzgb1d4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxjx6V7LSQZJzWnwU14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwrYHOCjSObdfqFDvl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"approval"},
{"id":"ytc_UgzJguInZMTpcqbcj7N4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzqCf9Pz6vptw4ugTN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"}
]
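A raw response like the one above is only usable if every record carries the four coding dimensions with recognized labels. As a minimal sketch, the check below parses the JSON and rejects malformed records; the allowed label sets are inferred from the samples and table shown on this page (the full codebook may define more categories), and the sample `id` value is hypothetical.

```python
import json

# Allowed labels per dimension, inferred from the responses shown above.
# Assumption: the real codebook may permit additional categories.
SCHEMA = {
    "responsibility": {"user", "company", "distributed", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "unclear", "regulate", "liability", "ban", "industry_self"},
    "emotion": {"indifference", "fear", "outrage", "mixed", "approval", "resignation"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and reject malformed records."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing 'id': {rec}")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}: {rec.get(dim)!r}")
    return records

# Hypothetical single-record response, shaped like the output above.
raw = ('[{"id":"ytc_example","responsibility":"user",'
       '"reasoning":"mixed","policy":"none","emotion":"fear"}]')
print(len(validate_response(raw)))  # → 1
```

Running the validator before the results reach the dashboard means an off-schema label (a common failure mode for LLM coders) surfaces as an explicit error instead of silently polluting the table.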