Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It's important to keep things in perspective. While you watched this video there is a high probability someone died in a traffic accident, 99 deaths a day on average or roughly 1 every 15 minutes in the US. It is so incredibly easy to let perfect be the enemy of good when tragedy sells so well, the old fashioned routine road deaths are boring to us, the shiny new deaths that can be attributed to technology grab our attention and turn our stomachs, even if that technology massively increases overall safety. Emotionally it may feel right, but the argument is no better than asserting that you shouldn't wear a seat belt because you might get stuck a burning wreck or one sinking to the bottom of a river. What is suspiciously missing from all the anti Tesla hot takes recently is a direct comparison of autopilot/full self driving miles to other similar highway miles and getting a real picture of what the technology accomplishes. Nor do they ever acknowledge that legally and by the terms of service the driver is still responsible for control of the vehicle. Harping on the "misleading" marketing term of autopilot is no more valid than the ones against beyond meat products using terms like "beef", "chicken" and "pork" to describe their vegetarian options. While two similar accidents is a clear safety signal that warrants investigations by Tesla and regulators, it is not nearly enough to draw conclusions from and scare monger over. It's especially silly when we could actually look at several years of driving data to make the point objectively.
Source: youtube · AI Harm Incident · 2022-09-04T00:0… · ♥ 1
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         resignation
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ugwcf4NlnAHXoRkjNF14AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "ban",     "emotion": "fear"},
  {"id": "ytc_Ugw3P1Ouo082rMvsRhV4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwvGNU7RFJHZr1LBa14AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none",    "emotion": "resignation"},
  {"id": "ytc_UgwSEpzixR-AqVhUtvJ4AaABAg", "responsibility": "user",      "reasoning": "deontological",    "policy": "none",    "emotion": "outrage"},
  {"id": "ytc_UgwG02JA0m58vKk2lel4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none",    "emotion": "fear"},
  {"id": "ytc_UgygjL51h7veFMTS7ex4AaABAg", "responsibility": "unclear",   "reasoning": "consequentialist", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_UgyoRF0fDm4zZ9ZchLt4AaABAg", "responsibility": "company",   "reasoning": "deontological",    "policy": "ban",     "emotion": "fear"},
  {"id": "ytc_UgzWcsQ541sgdzjPDeN4AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",    "emotion": "resignation"},
  {"id": "ytc_UgyEcKmV1JlJ0qMkmRN4AaABAg", "responsibility": "user",      "reasoning": "deontological",    "policy": "none",    "emotion": "indifference"},
  {"id": "ytc_UgzD7w9yJ7CZJ5uMVZl4AaABAg", "responsibility": "user",      "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]
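The raw LLM response above is a JSON array of per-comment codings, one object per comment id across the four dimensions (responsibility, reasoning, policy, emotion). A minimal Python sketch of how such a batch response can be parsed and indexed by comment id — the two entries below are copied from the response above; the variable names are illustrative, not part of the pipeline:

```python
import json

# Raw LLM response: a JSON array of coding objects, one per comment.
# This is a two-entry excerpt of the full response shown above.
raw_response = """
[
  {"id": "ytc_UgzWcsQ541sgdzjPDeN4AaABAg",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugwcf4NlnAHXoRkjNF14AaABAg",
   "responsibility": "none", "reasoning": "unclear",
   "policy": "ban", "emotion": "fear"}
]
"""

# Index the codings by comment id for direct lookup.
codings = {entry["id"]: entry for entry in json.loads(raw_response)}

coding = codings["ytc_UgzWcsQ541sgdzjPDeN4AaABAg"]
print(coding["emotion"])  # prints "resignation"
```

Indexing by id makes it straightforward to join each coding back to its source comment, as the Coding Result table above does for this comment.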