Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
_it has to be AI all the way down_ There is no AI in Tesla Autopilot; it's 100% implemented by heuristic coding (hand-coding by Tesla engineers).

_if you don't know what's in front, assume the worst._ The claim that Autopilot didn't stop because it didn't recognize the vehicle, which is one of the more egregious falsehoods in the WSJ hit piece, is what I like to call *FALSE.* It didn't stop simply because TACC from all companies doesn't deal well with stationary objects in the driving lane.

Here's Tesla's warning on the subject: *"WARNING: Autosteer* is not designed to, and will not, steer around objects partially in a driving lane and in some cases, *may not stop for objects that are completely blocking the driving lane."*

And lest you claim this is just Tesla protecting itself, this is from the Owner's Manual for a 2023 Chevy Bolt EUV: *"Warning:* ACC may not detect and react to stopped or slow-moving vehicles ahead of you. *For example, the system may not brake for a vehicle it has never detected moving."*

This is exactly the situation in the first anecdote. The Tesla came upon a crashed semi-truck it had never detected moving; and in line with the warnings from Tesla, Chevy, and likely everyone else with TACC or an Autopilot-like function, it failed to stop. Not because it didn't recognize the overturned truck (it didn't need to do so in order to stop) but because TACC systems in general may not stop for stationary objects in their lane. The driver's failure to heed this warning is not evidence of a failure of Autopilot; in fact, NHTSA has investigated several such accidents, and found in each case that Autopilot operated as designed and intended.
youtube AI Harm Incident 2025-01-05T22:0…
Coding Result
| Dimension | Value |
| --- | --- |
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytr_UgzfUS0xtiBfuFvfCmB4AaABAg.AD7SBJJ_dIkADwxsQ8WC5f", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "indifference"},
  {"id": "ytr_UgweVnr66i0YqUdRuzR4AaABAg.AD0vAaKzzeSAD7YcDmBR5N", "responsibility": "company", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytr_UgxURpG2BxlDiPhda_F4AaABAg.ACwUdEX21NxACwVQlWlV-B", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgxURpG2BxlDiPhda_F4AaABAg.ACwUdEX21NxACwr20fZ4Ng", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_Ugwzp58DTN1IBX_8ERB4AaABAg.ACw-TLF_wW2ACwVlsbWevm", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_Ugwzp58DTN1IBX_8ERB4AaABAg.ACw-TLF_wW2ACx7dccGkCS", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_UgxGdm19dqdWYCXBTRh4AaABAg.ACv87E-W_HcACvwaeRMegZ", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgxjZse9UwquSPXxBa94AaABAg.ACu1wJvl8CpACuTbc5UP1I", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgzkI3oblNzQq9goUhl4AaABAg.ACsco95dWTEACsxtfyLlyp", "responsibility": "company", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytr_UgzgJcQ1Wv6HfExc3c54AaABAg.ACqYngCWwk5ACr-o_BUue2", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "resignation"}
]
```
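The raw response pairs each comment id with the same five fields shown in the coding table (responsibility, reasoning, policy, emotion, plus the id itself). A minimal sketch of how such a batch might be parsed and indexed by comment id; the ids and the `raw` payload below are abbreviated stand-ins for illustration, not the real response above:

```python
import json

# Hypothetical abbreviated payload in the same shape as the raw LLM
# response: a JSON array of per-comment coding objects.
raw = """[
  {"id": "ytr_example_a", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability",
   "emotion": "indifference"},
  {"id": "ytr_example_b", "responsibility": "developer",
   "reasoning": "deontological", "policy": "none",
   "emotion": "indifference"}
]"""

# Index every coded row by its comment id for O(1) lookup.
codes = {row["id"]: row for row in json.loads(raw)}

def dimensions(comment_id):
    """Return the coded dimensions for one comment, without the id field."""
    row = codes[comment_id]
    return {k: v for k, v in row.items() if k != "id"}
```

Looking up `dimensions("ytr_example_b")` then yields the four dimension values shown in the coding-result table for that comment.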