Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
Random samples

- ytc_Ugyf0JlUN… — "i love how he articulates himself and how he alerts us that western propaganda …"
- ytr_Ugxlomr29… — "Hi, you got the wrong answer. The contest is over and winners have been announce…"
- ytc_Ugy-o9Zig… — "The thing about selfdriving cars being safer and making less trafic only applies…"
- ytc_UgyckCiGv… — "We will learn what AI is when it becomes sentient and decides man is the problem…"
- ytc_UgzxSC6cO… — "It could work for something like coco melon but to have a movie entirely made of…"
- ytr_UgzR-qLV6… — "Don't forget about Getty Images vs. Stability AI. Finally some laws will be deci…"
- ytc_UgxuRz5QN… — "I'm tempted to say something about the whole AI thing but teachers are clearly t…"
- ytc_UgxdDwcjT… — "They are putting a data center near the city I live in. The electric and water b…"
Comment
Tesla has never claimed that autopilot or FSD can prevent all accidents. That's obviously absurd. Also every major manufacturer has vehicles that offer capabilities similar to autopilot.
Fact is, the driver is 100% at fault and the court finding was wrong. The driver held down the accelerator, the car should reasonably be expected to accelerate, not argue with him about whether that's safe to do.
I do agree, however, that Tesla should stop calling this feature autopilot because it isn't - it's basically adaptive cruise control and lane keeping. Ford has BlueCruise, Chevy has SuperCruise - these are all partial automation systems similar to Tesla's Autopilot which are currently being used on public roads.
youtube · AI Harm Incident · 2025-08-16T15:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_Ugw0F3XLEUsfyFOVobN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyZi3YnyHLC_KGKT-d4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzb1C-4nFtWBrui0JJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxr1fXN1qD5Aw68emh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx8qZAjzVbaFYzapzl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgycVmgrDs7TRXJF90R4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugy8K9nkAsuSO_2tOCt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw1FwFF1iyjqD3NyZ54AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy_QWat5FfUuvuMAyZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgyTv1eVoVxFtR3zybd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}]
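A response like the one above can be consumed downstream by parsing the JSON array and screening each record against the label sets. A minimal sketch, assuming the four dimensions shown and allowed values inferred only from the responses on this page (the real codebook may define more):

```python
import json

# Allowed labels per dimension, inferred from the raw responses above.
# This is an assumption, not the project's actual codebook.
ALLOWED = {
    "responsibility": {"company", "user", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"liability", "regulate", "none", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "approval", "mixed", "unclear"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records whose four
    dimensions all carry a recognized label."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items())
    ]

raw = ('[{"id":"ytc_Ugw0F3XLEUsfyFOVobN4AaABAg",'
       '"responsibility":"company","reasoning":"deontological",'
       '"policy":"liability","emotion":"outrage"}]')
print(parse_codes(raw)[0]["emotion"])  # outrage
```

Records that fail validation (an unexpected label, a missing field, or malformed JSON) would surface as the `unclear` rows seen in the Coding Result table above; a real pipeline would log them for re-coding rather than drop them silently.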