Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytr_UgxPMmnHJ…` — @Kawaii.Cherri.Blossom No, I mean ChatGPT now has access to Google Docs to read…
- `ytc_Ugz4Qttw8…` — It's really a problem of population density, or the lack of in our cities. NYC f…
- `ytc_Ugy8gxS2G…` — Just wait till ai gets used to expose the government and courts, …… ai gone…
- `ytr_UgxbD12nV…` — I would rather have a society that values human life and ensures everyone has th…
- `ytr_UgwzmpRkq…` — @ I think AI is just derivative and not unique, less of a tool more of a glorifi…
- `ytc_UgxNM4CDb…` — Yeah... the fact that we actually have an AI called SkyNet. Should be a warning.…
- `ytr_Ugwie12bT…` — Side note after almost finished with the video I agree that A.I. Will become se…
- `ytc_Ugwc0gOfT…` — Remember I Robot when Will Smith was in his car and he took it out of autonomous…
Comment
These chatbots don't change "every day" they change every answer. The answers are a statistical representation of what the LLM was fed with. Bad answers are a part of the probabilistic nature of how these models work and even the reasoning models and fine tuning layers that are supposed to safeguard against these kinds of issues are not undefeatable. Ask a chat model the same thing often enough, press it to answer, reset history, try again - at some point it's just statistics that you'll get a bad answer.
youtube · AI Harm Incident · 2025-11-26T17:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugzw6BicBvBU0ugWzOZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzugAZ6HW25-np1bch4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugyv2D2UF5OtXKVj8qx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugyw3YgXXfvDo6HPoFh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx0g9sXIIyhhA72fL54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwGkkRf3vF0ul1fBsN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugxr7Byh7sQZL9v5TXd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgywCy8ub5fK_EmFdwx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyggMFgeVl9c-hbatB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxWkF00-fbauSOBbVZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
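The coding result shown above is one row extracted from the model's batch JSON response. A minimal sketch of that extraction step, assuming only the JSON schema visible in the raw response (the `parse_codings` helper and the two inlined records are illustrative, not the tool's actual code):

```python
import json

# Two entries copied verbatim from the raw LLM response above,
# standing in for the full batch payload.
RAW_RESPONSE = """[
{"id":"ytc_Ugzw6BicBvBU0ugWzOZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwGkkRf3vF0ul1fBsN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]"""

# The four coding dimensions shown in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codings(raw: str) -> dict[str, dict[str, str]]:
    """Parse the batch response into {comment_id: coding}, failing loudly
    if any entry is missing one of the expected dimensions."""
    records: dict[str, dict[str, str]] = {}
    for entry in json.loads(raw):
        # Raises KeyError if the model omitted a dimension for this comment.
        records[entry["id"]] = {dim: entry[dim] for dim in DIMENSIONS}
    return records

codings = parse_codings(RAW_RESPONSE)
print(codings["ytc_Ugzw6BicBvBU0ugWzOZ4AaABAg"]["responsibility"])  # -> ai_itself
```

Validating every dimension at parse time is what lets a bad batch (a truncated array, a missing field) surface as an error rather than as silently incomplete codings.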