Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
WSJ, I honestly want to know if you’re investigating all of the humans killing other humans by driving cars at the same rate that you’re investigating self-driving car fatalities. Why are you not focusing a story on how we can reduce vehicle fatalities to a fraction of a fraction of its current standing if we were to implement universal self-driving vehicles and a complete ban on human-operated motoring. Maybe you should investigate just how reasonably achievable this is in today’s society and with our current technology, and just how many billions of dollars this could save us if not one more human has to die because we let another flawed human behind a wheel? I’ll leave you with my PSA I leave on all DUI arrest videos: #WSJ #WallStreetJournal Humans should be prohibited from ever operating transit automobiles. Computers can do it for us today. We could do this tomorrow if we as a society truly cared about human life. There is no excuse. Only WE can stop drunk driving. There is no excuse for DUIs, and the harsh reality is that you, me, everyone is to blame for every single DUI. You allow humans to get behind a wheel.
youtube AI Harm Incident 2023-09-15T15:1…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          regulate
Emotion         approval
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgzDmyYGb04_fl5Vhx14AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "indifference"},
  {"id": "ytc_UgwPan9_yS9N0kZ3Ln94AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzD7Z_RuEo0hqILsQ14AaABAg", "responsibility": "company", "reasoning": "mixed", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_Ugy0Vd-5pqYE53uy_Ud4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugy8QI_p5qcZnihxq5Z4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_Ugw_2mObqmhfhrkC-5h4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "indifference"},
  {"id": "ytc_UgyHjykj2neMsIYtQpN4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugzo5urqv9tcQ9rn5q14AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugz-r2lvVjWBCFXTVr94AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugzycewy1LUka5BbmMp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
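The raw response is a JSON array of per-comment codes, one object per comment id, with one value for each of the four coding dimensions. A minimal sketch of parsing and validating such a response, assuming the allowed value sets inferred from the values that actually appear in this output (the real coding scheme may include more categories), and shortening the array to two rows for illustration:

```python
import json

# Two rows copied from the raw response above, abbreviated for the sketch.
raw = """[
  {"id": "ytc_UgzDmyYGb04_fl5Vhx14AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "indifference"},
  {"id": "ytc_UgyHjykj2neMsIYtQpN4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]"""

# Allowed values per dimension -- an assumption, inferred only from the
# values present in this particular response, not from the full codebook.
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "ban", "none"},
    "emotion": {"outrage", "approval", "indifference", "mixed"},
}

def index_codes(payload: str) -> dict:
    """Parse a raw LLM response and return {comment_id: {dimension: value}},
    rejecting any value outside the allowed set for its dimension."""
    codes = {}
    for row in json.loads(payload):
        for dim, values in ALLOWED.items():
            if row.get(dim) not in values:
                raise ValueError(f"{row['id']}: unexpected {dim}={row.get(dim)!r}")
        codes[row["id"]] = {dim: row[dim] for dim in ALLOWED}
    return codes

codes = index_codes(raw)
```

Validating against a fixed value set at parse time catches the common failure mode where the model invents an off-schema label, so bad rows fail loudly instead of polluting the coded dataset.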