Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
There are public records available that make it easy to assess. Tesla is 13 times unsafer then a human driver. Not even close to equally safe and miles away from safer. Autonomous driving is only interesting when it works. All current data show, it doesn't. That is the point we are at. Because when it does not work, it is a desaster. We are currently facing a desaster and are expected to just accept it. That should be the narrative, not some sort of future vision. Not some marginally possible utopia, which all data indicates is impossible with current technology. We should judge the tech solely on its current state and near future developments and all of that points to DESASTER. Currently we are promised working tech 2 years, 5 years from now, but the current tech is a menace sharing the roads with you. This no way to introduce or develop tech. You don't go ahead and just factor in the inevitable death of dozens maybe hundreds, just for a slim chance to make this burning pile of a dogshit tech stack work at some point.
youtube 2026-02-18T11:4…
Coding Result
Dimension        Value
Responsibility   company
Reasoning        consequentialist
Policy           liability
Emotion          outrage
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgxCTczrF5_qG1UUpDV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzYPogieASQSzrl2ph4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugygc3uEdJ-Q5Y5dhXl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxtyonzmXa5B4InHBt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzD7E8oNfnyM88rAj14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzj_C2Xy6ktxcCW6AV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxJMzlSrGOZBk96mfN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzSv52XjbclNkHhgyl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw3mrf9OIXdGjryaZ54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyTjAGRcXrUJB3m_6J4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]
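The coding result shown above can be recovered from this raw response by parsing the JSON array and selecting the entry whose `id` matches the comment. A minimal sketch, assuming the model output is valid JSON and that the comment's id is `ytc_Ugw3mrf9OIXdGjryaZ54AaABAg` (the entry whose dimensions match the result table):

```python
import json

# Raw model output: a JSON array with one coding object per comment id.
# Truncated here to the two entries relevant to this illustration.
raw = """[
  {"id":"ytc_Ugw3mrf9OIXdGjryaZ54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyTjAGRcXrUJB3m_6J4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]"""

# Index the codings by comment id for O(1) lookup.
codings = {entry["id"]: entry for entry in json.loads(raw)}

# Look up the coding for the comment displayed above.
result = codings["ytc_Ugw3mrf9OIXdGjryaZ54AaABAg"]
print(result["responsibility"], result["emotion"])  # company outrage
```

The dictionary lookup mirrors how the tool joins a stored comment to its row in the coding-result table.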