Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- Well Aroura is just like any other company that currently depends on current day… (ytc_Ugz8Pa8ai…)
- Unfortunately, people need to keep learning new skills even in middle age or lat… (ytc_UgyznEMuY…)
- I have a love/hate relationship with ai I think it it were trained on data espec… (ytc_Ugxr10jRP…)
- I never understand when they say winning the race, what does that mean? What doe… (ytc_UgwoxwDfW…)
- Nah man. In Red Alert I would just turtle up and then launch like 5 nukes at a t… (rdc_kp0o5v4)
- 50 does not come in between instead it could be 49 hence proved ai wrong… (ytc_UgxiX1wLJ…)
- Everything started at big groceries companies...auto-service / fast pay, people … (ytc_Ugy0QRAIJ…)
- I disagree w/the notion that the 1950's style skits that ppl are doing directing… (ytc_UgyRwLmZU…)
Comment

> No, this just shows while FSD is trash and why no level 2 self-driving system should be legal in the U.S. It wrongly gives many people the impression that they can rely on it, when they can't. It bizarrely requires that the driver take over a fraction of second before what might be a huge collision.

youtube · AI Harm Incident · 2025-06-10T22:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytr_UgwjXI1rHpESCiXK5Dx4AaABAg.ALmN6DMz2pbAOWPA6q1Bpf","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytr_Ugxv4NdIYqfIre31OU94AaABAg.AL3LqvIqUnkAM8Gc-EFsbs","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgymGiHMGAoN2aoozYZ4AaABAg.AJ7RjwAjz2SAJCj4-jeXSS","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytr_UgyKz07OTJK4yRO5tm94AaABAg.AIXfT4YQ1LlALMGpRN_UiN","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgzPbkVjZeJTfkOUNw94AaABAg.AHuCyb-cbdxAHuQL4RgIsZ","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugzw4LG7x8tPE9myfRl4AaABAg.9s9XjgAZz-K9sCRPpLFTkk","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_UgwSRZaDKJFXaSf9vxJ4AaABAg.9s9KP3IKBNS9s9bc1Huxf9","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgzbaxwQ1qRf14YRnNx4AaABAg.9s9H3OeypwM9sCQ8uJ_Rw9","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgynM0NJ_DH7nDss-854AaABAg.9s9FYsIn11S9s9XLpGqho0","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgynM0NJ_DH7nDss-854AaABAg.9s9FYsIn11S9s9_cGTeRQg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"resignation"}
]
```
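The raw response is a JSON array of per-comment codes, one object per comment ID. A minimal sketch of how such a response could be parsed and validated before the values reach the coding-result view (the allowed-value sets below are inferred from this one sample and are an assumption, not a confirmed schema):

```python
import json

# Allowed values per dimension, inferred from the sample response above
# (an assumption for illustration, not a confirmed codebook).
ALLOWED = {
    "responsibility": {"company", "ai_itself", "user", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"ban", "none", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "approval", "mixed", "resignation"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only entries that carry an id
    and a recognized value for every coding dimension."""
    valid = []
    for entry in json.loads(raw):
        if "id" not in entry:
            continue  # drop entries we cannot tie back to a comment
        if all(entry.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(entry)
    return valid

# Hypothetical single-entry response for demonstration.
raw = ('[{"id":"ytr_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"ban","emotion":"outrage"}]')
codes = parse_codes(raw)
print(len(codes), codes[0]["policy"])  # → 1 ban
```

Filtering on a closed value set at parse time means a malformed or off-codebook entry is silently dropped rather than rendered as a bogus code; a stricter pipeline might instead log such entries for re-prompting.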