Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
There's a big difference between Autopilot and FSD Supervised. The 2019 incident in Florida couldn't have been caused by FSD because it wasn't available at the time. Autopilot is glorified cruise control.
It was a very tragic event for the victims, one that would've been prevented if the driver had been paying attention instead of looking for his phone while simultaneously pressing the accelerator. The judgment was correct in that Tesla and the driver shared fault. Tesla's marketing might've been misleading, and the driver was irresponsible at best and negligent at worst. Someone died and another person faces lifelong suffering.
I've done 4,500 miles in my 2025 Model 3 and encountered some alarming flaws. I've had to take control a few times just in October. There's a lot of road construction in my area, and FSD has seemingly been baffled by it.
I've had to take control when my Model 3 was headed directly toward a closed-off express lane entrance with the orange-and-white arms down and orange lights flashing.
Weeks after purchasing, I was the first car in lane two at an intersection, waiting for the light to turn green with FSD enabled. Inexplicably, the car accelerated from the red light as soon as the left-turn light turned green for both sides of traffic. I had to brake to avoid going further into the intersection, which would've either hit the oncoming left-turn traffic or forced one or more of those drivers into an evasive maneuver, possibly causing an accident. I'm sure people thought I was on a cocktail of drugs or simply stupid.
I now have the 14.1.4 update, which seems improved from the previous version, but even with the update FSD has dumb moments that remind me I can never be complacent with FSD activated.
One example from this past weekend: my right turn was coming up in less than half a mile, but FSD took me from lane three to lane two. There was a truck in lane two moving slower than the cars in lane three. I had to take over and steer back to lane three to avoid missing my right turn. Not dangerous at all, but inconvenient and irritating.
FSD is not flawless; even one incident for one person proves that. I'm sure in a few years we will be driving fully autonomously with very rare incidents, but we aren't there yet.
Plus, people with older cars (or older "hardware," as the company calls it) will not get the same FSD software updates as those with Hardware 3.
I believe Elon should drop the "cameras are the only thing that's needed" philosophy and add LIDAR to supplement the cameras. Yes, the price will increase a bit, but that's better than continuous litigation, multiple hundred-million-dollar court judgments, appeals, and settlements.
Finally, the public should realize there aren't millions of Teslas driving themselves at this moment. FSD is currently an $8,000 option or a $100/month subscription. I promise you the adoption rate is minimal relative to how many Teslas are on the road. A lot of owners aren't even paying for the $10/month premium connectivity feature.
Source: youtube · Incident type: AI Harm Incident · Posted: 2025-11-04T08:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
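Each coding dimension takes a value from a small closed vocabulary. Below is a minimal validation sketch; the allowed-value sets are inferred from the coded examples on this page, not from an official codebook, so a real project may define more categories.

```python
# Allowed values per dimension, inferred from the coded examples on this
# page; the real codebook may include additional categories.
ALLOWED = {
    "responsibility": {"user", "company", "ai_itself", "government", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "liability", "ban"},
    "emotion": {"outrage", "indifference", "resignation", "approval"},
}

def validate_coding(record: dict) -> list[str]:
    """Return a list of problems with a coded record (empty list = valid)."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"{dim}={value!r} not in {sorted(allowed)}")
    return problems

record = {"responsibility": "user", "reasoning": "consequentialist",
          "policy": "none", "emotion": "resignation"}
print(validate_coding(record))  # -> []
```

A check like this catches the common failure mode of LLM batch coding: the model inventing a label outside the schema.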
Raw LLM Response
```json
[
  {"id":"ytc_Ugwi0dbbM6Cib4b46I14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyBV2BcWusSkzJmebF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwaWkhMePbEdKhp6hx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwOk00_a9YTJoK0SpF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwtA6aHdXzh62w-YP14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx-rE8ig63Na_MnnlR4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwjeVqeoZW7VbrTFod4AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"indifference"},
  {"id":"ytc_UgyH_V8KgoftUjWMXVN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzw8EPeKdNzaCLv2aV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwW74b8V5oseIxmVQ14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
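To support looking a comment up by its ID, a raw batch response like the one above can be parsed and indexed on `id`. A minimal sketch (the variable names are illustrative, not taken from the actual tool; the two entries are copied from the response above):

```python
import json

# A truncated copy of the raw batch response: a JSON array of coded records.
raw_response = """[
  {"id": "ytc_Ugwi0dbbM6Cib4b46I14AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyBV2BcWusSkzJmebF4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]"""

# Index the batch by comment ID so a single coding can be retrieved in O(1).
codings = {item["id"]: item for item in json.loads(raw_response)}

coding = codings["ytc_Ugwi0dbbM6Cib4b46I14AaABAg"]
print(coding["emotion"])  # -> outrage
```

Keying on the model-echoed `id` also makes it easy to detect dropped or duplicated comments by comparing the dict's keys against the IDs that were sent in the batch.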