Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
As a Tesla owner, it's also our responsibility to pay attention to what's happening in front of us. This system isn't perfected and Tesla has made that known as FSD BETA. Beta testing means that a system has enough features to do a limited distribution and TEST for bugs. It's up to the drivers to not hit anything or anyone. Irresponsible drivers should be held accountable. If my car got within 4 car lengths of a motorcycle on a highway I'm taking control. I don't like to be behind them regardless, but I'm sure not going to let my vehicle continue to close that distance to an uncomfortable level either before changing lanes or slowing down. I don't think that FSD will be perfected till 2035, maybe slightly sooner, but there's SO MUCH that the AI doesn't know that a human driver does. I think that FSD hasn't been misleading, because as a vehicle feature, the hardware will support the software upgrades and the software HAS been getting continuous updates. It's PEOPLE who aren't reading the big screen that specifically says "PAY ATTENTION". I do agree though that the removal of the radar was more about cost, as airplanes blend sensor data seamlessly, but they do also have redundant computers and they duke it out when they disagree as well. He's not lying about that with the "which sensor is correct" and the computer having to choose one. I truly am gutted by the fact that these riders were struck by a vehicle and lost their lives. It's a tragedy. I do think that Tesla still has a lot of work to do to make the system perfectly safe, but after having been a driver for 35 years and been involved in 3 accidents not caused by me, and having witnessed countless others, we don't live in a perfectly safe world as it is. I agree that holding them to a high standard is imperative though.
youtube AI Harm Incident 2022-09-19T20:4…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  user
Reasoning       consequentialist
Policy          none
Emotion         approval
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgwRAc1cpVtv3KqMvrR4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugw0QPN7rCVGJkXyXTV4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxAophKuEdZu96R_tR4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugzs54X6AL70KsgkbsV4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugwxa-eI271z4EOd_8x4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzMeZ2kJodXqAm5BM14AaABAg", "responsibility": "government", "reasoning": "contractualist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzJNasqhuGRVyKdgn54AaABAg", "responsibility": "company", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxRUgAoXueYjt4oUit4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugw86r_DMDrD6uDOCRx4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugw8Ei10EazxKqe_xAN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
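As a minimal sketch of how a coding result like the one above could be recovered from the raw model output: the response is a JSON array with one record per comment id, and the per-comment table is just the record matching that comment's id. The field names come from the response shown here; the `parse_codings` helper and its validation rule are assumptions for illustration, not the project's actual pipeline code.

```python
import json

# Excerpt of a raw LLM response (two of the records shown above):
# a JSON array with one coded record per comment id.
raw = """[
  {"id": "ytc_UgxRUgAoXueYjt4oUit4AaABAg",
   "responsibility": "user", "reasoning": "consequentialist",
   "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwRAc1cpVtv3KqMvrR4AaABAg",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "ban", "emotion": "outrage"}
]"""

EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(payload: str) -> dict:
    """Parse the model output and index codings by comment id,
    dropping any record that is missing an expected field."""
    records = json.loads(payload)
    return {r["id"]: r for r in records if EXPECTED_KEYS <= r.keys()}

codings = parse_codings(raw)
# Look up the coding for the comment displayed on this page.
print(codings["ytc_UgxRUgAoXueYjt4oUit4AaABAg"]["responsibility"])  # user
```

The id-keyed dictionary makes it cheap to join the model's codings back onto the original comments, and the key check guards against partially malformed records in the raw output.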