Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- that is not how it works lol. Putting something in public doesn't mean anyone ca… (ytr_UgzWd8pd3…)
- You know for a high end robot you think they give her a little less of a recedin… (ytc_Ugwl7DhFI…)
- AI is like "science" done wrong. If you allow biased humans to program it to ca… (ytc_UgzkeZWFi…)
- Literally the guy who promised that there would be no fatalities in their driver… (ytc_UgwgzTVtO…)
- Remember, Ai learns from our actions, and since 1 billion people use ChatGPT ev… (ytc_Ugx-U1IEl…)
- On facial recognition: Republicans concerned about limited freedom. Democrats con… (ytc_UgxDP1xIL…)
- "AI is neither good nor bad. It is about how it is used" doesn't sound like a go… (ytc_UgwQDggEv…)
- Yes even though the people at the top of the skills hierarchy can beat what is e… (ytc_UgxjzjfI5…)
Comment
I don't know if this figured into the case at all, but the video footage showing the S driving straight off the road and into the parked car at full speed, with no evidence of slowing down at all, shows not only that the Autopilot feature didn't live up to its hype, but that the much more BASIC, FUNDAMENTAL feature of automatic collision avoidance ALSO failed. Tesla advertised the heck out of how safe its vehicles were with what they originally branded "TeslaVision" - 360-degree monitoring around the car, blind-spot detection, front and rear impact prediction, etc. The very first use case for that tech is to automatically hit the brakes if the car detects that a crash is imminent, to either avoid the crash or minimize the damage. And this had been a standard safety feature on higher-end cars for at least a decade before McGhee's 2019 S was built.
The video clip shows no evidence that the car tried to slow down at all. Even if McGhee had been overriding the default speed control by holding down the pedal, the car should still have applied emergency braking, overriding McGhee's controls, because that's exactly what Tesla advertised in its suite of safety features. It more recently has said all of those things (collision avoidance, lane assist, traffic-aware cruise control, lane departure warnings, reading speed-limit signs, etc.) are part of its Autopilot suite (basic or enhanced) and are standard even if you don't buy the extra FSD package. The terminology was a little different in 2019 than now, but the distribution of features is pretty much the same.
So, even disregarding the facts that McGhee wasn't paying attention, that he was overriding the speed control, and that he was relying on FSD to behave in a way it wasn't designed to, Tesla's other prominently advertised safety system completely failed. In fact, this has happened in a number of other cases, too - a Tesla car in self-driving mode famously crashed into a parked semi with a shiny metallic trailer because the cameras were fooled into thinking the road was clear in front of it.
Near where I live, a Tesla vehicle had FSD turned on and was in stopped traffic when it unexpectedly accelerated and ran over a motorcycle in front of it, killing the rider. In that case, the driver admitted to not paying attention, so Tesla never had to answer why the car thought it should slam on the accelerator when nobody around it was moving.
And now, Musk is insisting that self-driving cars should be able to rely solely on visible-light cameras, and Tesla is no longer installing radar sensors in current models. Unsurprisingly, their safety ratings have gone way down since making that decision.
youtube · AI Harm Incident · 2025-08-20T06:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwYoH8BUWQJNUbcfjt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz2EZU9RNuRziAjnX54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzC350sEEiXQTBXjdN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwI8lQp29mHILyBN8d4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzl_F01hE4nghXjmEx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxyjOF89fd51WvwZEN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy8RZpce6dWScg_jFR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw-uJCgijbOOtdg0Fp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwTeD0eeCC6zT4B4yl4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzVtu70xpEPgD6OnpR4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"outrage"}
]
```
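A raw response like the one above is only usable downstream if every row conforms to the coding scheme shown in the table. A minimal sketch of how one might parse and sanity-check such a batch, assuming the category sets inferred from this single response (the project's actual codebook may define more values, and `validate_codings` is a hypothetical helper, not part of the tool):

```python
import json

# Category sets inferred from this one batch; the real codebook may
# include additional values (this is an assumption for illustration).
ALLOWED = {
    "responsibility": {"company", "user", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "mixed"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject rows that stray from the codebook."""
    rows = json.loads(raw)
    for row in rows:
        # Comment IDs in this tool appear as "ytc_..." (or "ytr_..." for replies).
        if not str(row.get("id", "")).startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {row.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: invalid {dim}={row.get(dim)!r}")
    return rows

# One row from the batch above, used as a smoke test.
raw = ('[{"id":"ytc_UgwYoH8BUWQJNUbcfjt4AaABAg","responsibility":"none",'
       '"reasoning":"mixed","policy":"none","emotion":"outrage"}]')
codings = validate_codings(raw)
print(len(codings), codings[0]["emotion"])  # 1 outrage
```

Failing loudly on an out-of-codebook value is deliberate: LLM coders occasionally emit free-text labels, and catching that at parse time keeps the coded dataset consistent.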