Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
shame on you WSJ for such a sensationalist piece. you fail to ask ANY of the relevant questions, while pulling our heart strings about dead fathers and supposed corporate greed. The question was never "will a Tesla be 100% safe", this is a ludicrous question and the answer is obvious: no, there will always be Tesla crashes, due to bugs, hardware failures and hard to model circumstances like overturned trucks. The claim/dream is "safer than human", because it doesn't have road rage, doesn't fall asleep, doesn't try a risky overtake to save 10 seconds, doesn't drive drunk, doesn't look at their phone, doesn't argue with their wife, doesn't say to their buddies lets find out what this car can do. Every one of those behaviours are much much MUCH more common than overturned trucks. We are going to lose people we love due to Tesla autopilot, but we are losing more people we love if we continue to trust human drivers. Stop spreading fear of technology and pretending you are the cautious one. This is cheap and disgusting. After 9-11 hundreds of thousands of people took the 'cautious' decision to drive instead of fly, and literally thousands of people died, because driving is that much more dangerous than flying. Help us move forward, help us grow as a society. Stop actively holding us back for clicks. There are relevant questions we should be asking, and I beg that you start asking them
Source: youtube · AI Harm Incident · 2024-12-22T09:2… · ♥ 1
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgyFO-9eC5zlG_hNGah4AaABAg", "responsibility": "company",   "reasoning": "deontological",    "policy": "liability",     "emotion": "outrage"},
  {"id": "ytc_Ugy6yOKkwiyREQn9Tcl4AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",          "emotion": "approval"},
  {"id": "ytc_UgweJGnMI6_EBDZwdvp4AaABAg", "responsibility": "company",   "reasoning": "consequentialist", "policy": "regulate",      "emotion": "outrage"},
  {"id": "ytc_Ugzr6JIuyym3nDLWXFF4AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",          "emotion": "indifference"},
  {"id": "ytc_UgwXXQuF4QbXVXAx_Ah4AaABAg", "responsibility": "user",      "reasoning": "deontological",    "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgwmE1kGedyjXd6UD0l4AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",          "emotion": "resignation"},
  {"id": "ytc_Ugy-2IfQ7P4otztnO454AaABAg", "responsibility": "developer", "reasoning": "deontological",    "policy": "liability",     "emotion": "outrage"},
  {"id": "ytc_Ugzoyb7wAsOqXml2Ffx4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none",          "emotion": "resignation"},
  {"id": "ytc_UgxZCHcUzzyd5UhZgIB4AaABAg", "responsibility": "user",      "reasoning": "consequentialist", "policy": "none",          "emotion": "resignation"},
  {"id": "ytc_Ugy7-6-aXZv1BjShoNV4AaABAg", "responsibility": "company",   "reasoning": "deontological",    "policy": "liability",     "emotion": "outrage"}
]
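The coding result shown above is a per-comment lookup into this raw array. A minimal sketch of how that lookup might work, assuming the response is valid JSON with the four dimension fields shown; the function name `codes_for` and the single-entry excerpt in `raw` are illustrative, not part of the pipeline:

```python
import json

# Excerpt of a raw LLM response: a JSON array of per-comment codes.
raw = """[
  {"id": "ytc_Ugzr6JIuyym3nDLWXFF4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]"""

# The four coded dimensions displayed in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def codes_for(raw_json: str, comment_id: str) -> dict:
    """Return the coded dimensions for one comment id, or raise KeyError."""
    for entry in json.loads(raw_json):
        if entry.get("id") == comment_id:
            return {d: entry[d] for d in DIMENSIONS}
    raise KeyError(comment_id)

print(codes_for(raw, "ytc_Ugzr6JIuyym3nDLWXFF4AaABAg"))
# → {'responsibility': 'none', 'reasoning': 'consequentialist',
#    'policy': 'none', 'emotion': 'indifference'}
```

Parsing the whole array once and indexing by `id` is what makes a "Coding Result" panel like the one above reproducible from the stored raw response.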