Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "My gut/intuition/spirit/soul…whatever you want to call it says this is going to …" (ytc_UgwpKxJWY…)
- "so if ai is taking jobs whos going to buy the products they are making?…" (ytc_Ugz5HlFqr…)
- "You cannot compare former periods of automation ( industrial revolution, comput…" (ytc_Ugz4LhFvA…)
- "One thing I really hate is the odd time I stumble across a piece of AI "Art" tha…" (ytc_Ugx63vtd2…)
- "@alansmithee419 I agree that if the AI systems have the right goal, then there …" (ytr_UgxRGLCxC…)
- "@RichardBaran The biggest flaws of the free Tesla autopilot system are it does n…" (ytr_Ugz6hgp_S…)
- "Honestly fuck AI and even though I don't wanna put in the effort or training I n…" (ytc_Ugxb4xJM9…)
- "One of the big problems with LLMs is that they are idealogically biased. Just th…" (ytc_Ugxw8IeGl…)
Comment
> It's definitely not the child or AI's fault..it was him who had already decided to do that to his life by then..gpt was just a last option to have a sort of communication on how he feels when he might have felt the absence of understanding from his own parents..he must've had his own reasons

youtube · AI Harm Incident · 2025-09-01T14:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugyc4aTwAw4dS98Hr-l4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyPFrTZcp9icqPuuoN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyFCcgwU2PC1pEgt894AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwd8oKC6LXJKR7ZCtN4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"ban","emotion":"sadness"},
  {"id":"ytc_UgzK4T35GWw_aXuH4nV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzxnGpi0qNlq9lRZFF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"sadness"},
  {"id":"ytc_UgwPgTBKq17sB0GELFV4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugwk7FzhASNMfAIsqF14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwTft5pYB5mvBwIhXF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgzJwoIMdL2--S9-h2N4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"sadness"}
]
```
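A raw response like the one above can be parsed and sanity-checked before the codings are stored. The sketch below is illustrative only: the allowed value sets per dimension are inferred from the samples shown on this page, not from an official codebook, and `validate_codings` is a hypothetical helper, not part of the pipeline.

```python
import json

# Allowed values per dimension, inferred from the sample output above.
# Assumption: the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"user", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"none", "ban", "liability", "regulate", "unclear"},
    "emotion": {"outrage", "sadness", "resignation", "indifference", "mixed", "unclear"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed coding rows."""
    valid = []
    for row in json.loads(raw):
        # Each row must be an object with a comment ID and one valid
        # value for every coded dimension.
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"user",'
       '"reasoning":"deontological","policy":"none","emotion":"outrage"}]')
print(validate_codings(raw))  # the single well-formed row passes
```

Rows with unknown category values are dropped rather than corrected, so malformed model output never reaches the coded dataset silently.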