Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@stuartd9741 This isn't a matter of hypothetical opinion. We have data. It's simply a matter of measuring number of miles per collision. Or in a level 2 scenario, number of miles per intervention (i.e. if the human doesn't intervene in a drive, then that drive could have been level 4). It's not guesswork. Currently, automation has a better safety record than human drivers. For many automation systems, one could argue that the number of miles is too small for statistical significance. Tesla has by far the largest amount of data, with 150 million miles on FSD from October 2020 to March 2023 (source: Tesla Q1 2023 Update). The data clearly show that FSD improves safety. Interpretation of the data is nontrivial, but Tesla and NHTSA are able to use the data to gauge policy, such as permitting level 3 operation. The least complicated data point is collisions per million miles on vehicles equipped with FSD. This includes miles driven by a human driver, so objections pertaining to selection bias don't apply. Teslas equipped with FSD have an accident rate of 0.31 accidents per 1 million miles, compared to the national average of 1.53. The rate for miles driven with FSD turned on is quite a bit lower, but subject to selection bias because drivers tend to select safer conditions to enable FSD. Also, FSD gets safer with each over-the-air update, so the current safety is even better.
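The rates quoted in the comment can be sanity-checked with a little arithmetic. The figures below are taken at face value from the comment itself (0.31 and 1.53 accidents per million miles); this sketch only converts them to miles-per-accident and a ratio:

```python
# Figures quoted in the comment above, taken at face value (not independently verified).
fsd_equipped_rate = 0.31   # accidents per million miles, FSD-equipped Teslas
national_avg_rate = 1.53   # accidents per million miles, national average

# Miles per accident is the reciprocal of the per-mile rate.
miles_per_accident_fsd = 1_000_000 / fsd_equipped_rate
miles_per_accident_avg = 1_000_000 / national_avg_rate

print(round(miles_per_accident_fsd))               # 3225806  (~3.2M miles per accident)
print(round(miles_per_accident_avg))               # 653595   (~0.65M miles per accident)
print(round(fsd_equipped_rate / national_avg_rate, 2))  # 0.2 (roughly a 5x lower rate)
```

The comparison says nothing about confounders (vehicle age, driver demographics, road mix); it only restates the commenter's numbers in a different unit.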
youtube 2023-05-31T16:5…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          approval
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytr_UgyJTiSXsv2HxalX1Ld4AaABAg.9qLVnqQvkX29qLuH_UBKBx","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytr_UgyJTiSXsv2HxalX1Ld4AaABAg.9qLVnqQvkX29qM7yl9admv","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytr_Ugxm7XNzKocMoy8ioWB4AaABAg.9qLUVuCCcx29qNBysve-Y7","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzqIwROMLlXrXOxkYB4AaABAg.9qLS5fQCWz39qMOQ_YeL1G","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
  {"id":"ytr_UgzqIwROMLlXrXOxkYB4AaABAg.9qLS5fQCWz39qTtRsOIW2z","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
  {"id":"ytr_Ugx0yEpS-DkEw_iu8Vx4AaABAg.9qLLqXKctLg9qLRQ1V4nYO","responsibility":"unclear","reasoning":"unclear","policy":"regulate","emotion":"fear"},
  {"id":"ytr_Ugx0yEpS-DkEw_iu8Vx4AaABAg.9qLLqXKctLg9qMQb0beZIU","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxzmVvO8JsmqM6IYi94AaABAg.9qLJ7bNJFFh9qLSVhAtjAG","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgxzmVvO8JsmqM6IYi94AaABAg.9qLJ7bNJFFh9qLvhNXAGeF","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxzmVvO8JsmqM6IYi94AaABAg.9qLJ7bNJFFh9qO3emJHI2K","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]