# Raw LLM Responses

Inspect the exact model output for any coded comment: look up a response by comment ID, or browse the random samples below.

## Random samples
- "Why frame Musk as the face of AI robber barons ? Open AI, Meta, Amazon, NVidia, …" (ytc_Ugx8AO5ud…)
- "AI is soulless and destroys the environment. Either make it yourself or hire a p…" (ytr_UgxnEDO8H…)
- "It's so frustrating knowing that the main reason people want driverless cars is …" (ytc_Ugxu88lil…)
- "if they are to call us Luddites, we shall give new meaning to the word. as a dis…" (ytc_UgxoH5JNj…)
- "Didn’t they use an AI platform to solve an open math problem in 6 hours? Terence…" (ytc_UgzMBjDa0…)
- "Rather pointless video in which ChatGPT has to explain over and over again the o…" (ytc_Ugyo1Vtt6…)
- "AI is going to be a blessing for Americans. What we need is unity around AI as a…" (ytc_UgwsDZViO…)
- "I hate having to use AI just because my boss wants us to use it. We were fine wi…" (rdc_n8a8dqe)
## Example coded comment

> These developers don't even know what results their training of the LLMs will have. It's a black box for them, too.
>
> One niggle: LLMs aren't "trying" to do anything. They don't think. They merely string statistically desirable words and content together. They only make themselves _sound_ like they think.
>
> Which is why the developers are ultimately responsible for whatever their negligence causes through their products.

*youtube · AI Harm Incident · 2025-11-08T05:4…*
### Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
### Raw LLM Response

```json
[
  {"id":"ytc_Ugy7ODePSietitcB0l54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugyv6bOD5aAG2_Ze6_V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzZd0YD4ZfM9ZRVxOt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxrAXvpU33wtHB7lTh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwsAqHQ54GP5dBTwx94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwDe01gysZmS-JaWap4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwnXmnsyR0r5unuYm54AaABAg","responsibility":"government","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyL1nu_AbwXy6xIaXN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwZEqpzAkyNtZWLFp14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwGeUQ6WrSVBwVRhdd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
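Because the raw response is plain JSON, it can be loaded and indexed by comment ID to drive the look-up view. A minimal sketch of that step is below; note the `SCHEMA` sets are inferred only from the values visible in this one batch, not from the project's actual codebook, which may define more codes per dimension.

```python
import json

# Allowed codes per dimension, inferred from the sample batch above
# (assumption: the real codebook may include additional values).
SCHEMA = {
    "responsibility": {"developer", "company", "government", "ai_itself", "unclear"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"liability", "unclear"},
    "emotion": {"outrage", "fear", "mixed", "indifference", "unclear"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM response and index records by comment ID,
    rejecting any record with a missing or out-of-schema value."""
    by_id = {}
    for rec in json.loads(raw):
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
        by_id[rec["id"]] = rec
    return by_id

# Example: look up the coded comment shown above (one record from the batch).
raw = ('[{"id":"ytc_UgxrAXvpU33wtHB7lTh4AaABAg","responsibility":"developer",'
       '"reasoning":"deontological","policy":"unclear","emotion":"mixed"}]')
batch = parse_batch(raw)
print(batch["ytc_UgxrAXvpU33wtHB7lTh4AaABAg"]["emotion"])  # prints "mixed"
```

Validating every record against the schema at parse time is what makes a value like `unclear` trustworthy later: it is a deliberate code, not a silently malformed field.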