Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
10-year Tesla S driver here. I like Autopilot and use it frequently, including on non-highways. I'm well aware of its limitations. But I'm an electrical/computer engineer and I'm sure I know a lot more about the technology and its limitations than the average driver. To me, or to an airplane pilot, "Autopilot" is analogous to a simple airplane autopilot -- good only for cruising. But I can easily see how it could falsely imply much more to the average person, and that's dangerous. "Full Self Driving" (which I don't have) is even more egregious; it's simply a lie.

One point I haven't seen yet. Tesla Autopilot monitors driver attention by sensing occasional small torques on the steering wheel. This is very fatiguing on long drives, and I get frequent false "pay attention!" alerts even when my hands are on the wheel, especially on a long straight highway like I-5 in central California. Occasionally it disables Autopilot until the next drive, forcing me to get off the road, turn the car off and on again. Very annoying, fatiguing, and potentially unsafe; it prompts some drivers (not me) to find ways to defeat it entirely. I *do* want an attention-sensing feature, but one that works well.

In the Florida accident, the driver was apparently overriding the autopilot by pushing the accelerator. Yes, this is how the car behaves. It flashes a small warning on the dash but does not disengage the autopilot. It probably should. But he overrode the autopilot speed with the pedal only because the autopilot wouldn't let him set the speed he wanted; it was above the limit. So a feature meant to improve safety actually contributed to the accident.

Even if the car had a radar (mine does, I don't know about his) you *can't* rely on it to avoid hitting an obstacle at highway speeds. The radar has limited range, just like a headlight, and highway speeds *will* outdrive it. Also, the radar beams are fairly broad so the autopilot is constantly having to ignore returns from roadside signs, guard rails, parked vehicles and overpasses. It's supposed to use the camera to help distinguish objects. But at high speed, on a curve, or at night it can mistakenly ignore a stopped car *in* your path. (A *moving* vehicle in front of you is easier to detect because the radar can sense relative motion. If it's moving, it's probably on the road.) This almost certainly caused those rear end collisions with emergency vehicles.

The bottom line is that even with the autopilot on you *must* keep scanning the road ahead. The radar is great at lower speeds, especially in stop-and-go freeway traffic where it really reduces the fatigue.

There are more things Tesla could do to improve safety. The car has GPS navigation. In Florida it knew that it was approaching the T so it could have slowed down or at least sounded the (loud) collision alert warning. But as far as I can tell, the autopilot makes little or no use of the GPS. It doesn't even know the speed limit until it sees a sign, even on a road I've driven before. Tesla could do much better even with the hardware they have; it just takes more software work.
youtube AI Harm Incident 2025-09-30T14:2…
Coding Result
Dimension       Value
Responsibility  user
Reasoning       consequentialist
Policy          none
Emotion         approval
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugx2Mv8TukwB_VeXp5F4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwh4Fu7UDRAU7pAi3F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugz1l6tgW8rZ2O6IvNR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxgJ8QZX3PfSZD4tah4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw7G17Z3BZ8dJT_syl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwRWiRPSeOxdyV-BY94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugwflz_W5JsX-kJlGa94AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_Ugwy8FJEbVVI2EFEf594AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_UgzvndmsEWPB0d9uFaF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzxH7EpIvK13GD55Jl4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"}
]
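A minimal sketch of how a raw response like the one above might be parsed and matched back to a comment id. The field names and ids are taken from the JSON above; the allowed-value set shown is an assumption inferred from the values seen on this page, not a documented schema:

```python
import json

# Excerpt of the raw model output: a JSON array of per-comment codings.
raw = '''[
  {"id":"ytc_UgxgJ8QZX3PfSZD4tah4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwRWiRPSeOxdyV-BY94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]'''

# Assumed vocabulary, inferred from values observed in this response.
ALLOWED_RESPONSIBILITY = {"user", "company", "distributed", "none"}

# Index the codings by comment id for lookup.
codings = {row["id"]: row for row in json.loads(raw)}

# Look up the coding for the comment displayed on this page.
coding = codings["ytc_UgxgJ8QZX3PfSZD4tah4AaABAg"]
assert coding["responsibility"] in ALLOWED_RESPONSIBILITY
print(coding["emotion"])  # approval
```

Indexing by id this way makes it easy to join the model's codings back to the original comment records and to flag any responses whose values fall outside the expected vocabulary.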