Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytr_Ugxlb4O4S…: "@jonm7888 Well, perhaps not in real time in every model out there, but there cer…"
- ytr_Ugwn-uz2l…: "Well... It seems like you’re using his tone as an exit ramp to avoid engaging wi…"
- ytc_UgyESZAGW…: "The fat lazy people in Wall E we see are likely just the rich elites, most of th…"
- ytc_UgwYVMbdb…: "I wasn't born yesterday, was prepared to learn this digital age, learn codes wan…"
- ytc_Ugy_ZF9aW…: "I do not see the huge tech companies creating a beast of products that removes t…"
- ytc_UgyjunvW7…: "I love it whenever something happens inside of a story and I can connect it back…"
- ytc_UgyXsKerH…: "Ther needs to be a law that AI videos are visibly marked so people can identify …"
- ytc_Ugx3pXiSq…: "Why the shots of Waffle House and a Stadium though? Is this video AI built?…"
Comment
So autopilot accident in 2019 with technology that is several generations older is the lead part of the story? Autopilot is not auto drive, and Tesla has not said that their vehicles are fully autonomous. Consider the technology driver assist. If you left your car on cruise control and crashed who do you blame? This really feels like a hit piece. It would have been nice to see a more balanced story.
youtube · AI Harm Incident · 2025-11-16T16:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
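The four coding dimensions appear to draw from closed categorical code sets. A minimal validation sketch, assuming the allowed values are exactly those observed in this dump (the actual codebook may define more):

```python
# Categorical code sets as observed in this dump. Assumption: the real
# codebook may include values that happen not to appear here.
CODEBOOK = {
    "responsibility": {"user", "company", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"none", "ban", "liability", "regulate", "unclear"},
    "emotion": {"approval", "indifference", "outrage", "fear"},
}

def validate_coding(coding: dict) -> list[str]:
    """Return a list of problems; an empty list means the coding passes."""
    problems = []
    for dim, allowed in CODEBOOK.items():
        value = coding.get(dim)
        if value not in allowed:
            problems.append(f"{dim}={value!r} not in {sorted(allowed)}")
    return problems

# Example: the coding shown in the table above.
print(validate_coding({"responsibility": "user", "reasoning": "deontological",
                       "policy": "none", "emotion": "indifference"}))  # []
```

Running a check like this over each parsed batch is a cheap way to catch a model emitting an off-codebook label before it reaches the database.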
Raw LLM Response
```json
[
{"id":"ytc_UgxlDYVe1_sSxLh-xgB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgxsN3Z8yyN46gupSQp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzJalLpBrpurPhgHOR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz3NHGLwINB-zlR3Zx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwRmpHthrFCTs57k0p4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgzrYcKqvmgjS_VTiFZ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyiZXaOesPnRAPyx9x4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz9dPuV7BjC7yh5SrF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugxx-lpM9HDEAr3YneZ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzh8U50IzCywSadsFZ4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]
```
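Since the raw response is a JSON array with one coding object per comment, the "look up by comment ID" view reduces to a parse-and-index step. A sketch, assuming each batch parses as a flat list whose `id` fields are unique (two entries from the batch above are used for illustration):

```python
import json

# Two coding objects copied from the raw response above; a real batch has ten.
raw = '''
[
 {"id":"ytc_UgxlDYVe1_sSxLh-xgB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
 {"id":"ytc_Ugz3NHGLwINB-zlR3Zx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
'''

def index_codings(raw_json: str) -> dict[str, dict]:
    """Parse a batch response and index the coding objects by comment ID."""
    entries = json.loads(raw_json)
    by_id = {e["id"]: e for e in entries}
    if len(by_id) != len(entries):
        raise ValueError("duplicate comment IDs in batch")
    return by_id

codings = index_codings(raw)
print(codings["ytc_Ugz3NHGLwINB-zlR3Zx4AaABAg"]["emotion"])  # indifference
```

The duplicate-ID check matters because the model, not the application, generates the array; silently overwriting one coding with another would corrupt the lookup.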