Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The problem is that no amount of safety will ever be good enough for people but only when they cannot be held accountable. FSD is already shown to be nine times safer than a human driver in accidents per miles driven. When that number becomes 100x safer it still will not be enough. Humans can kill 50,000 people driving manually and nobody bats an eye, if Tesla reduced that from 30k to only one person, they would be sued into Oblivion and YouTubers looking for clicks like this person would be saying, "well you could use lidar, you could use radar, you could use CO2 detectors, you could use infrared, you could use you could use you could use you could use." Nothing will be good enough which is why we will not have self-driving cars until they have some sort of government protection stemmed from everyday Americans wanting to see 30,000 lives a year saved understanding that there still will be some life lost because NO system will ever be perfect.
youtube 2026-04-09T00:5…
Coding Result
Dimension        Value
Responsibility   distributed
Reasoning        consequentialist
Policy           liability
Emotion          outrage
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[{"id":"ytc_Ugy9tdBbB9hrFjKYKmV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgzlKLZNeTJmLUULum54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UgyoEaRVuI0jq5zye7F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugx2ZUtG539KTnT_nrN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgyACA_tP1esiLUe6mx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
 {"id":"ytc_Ugw28IIUnhqMgH8l6Jd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
 {"id":"ytc_Ugw5piqVZRMOW_40pWl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
 {"id":"ytc_Ugy_6aFmUbFEvAQ33up4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgyACopbj6pnHeKvo5l4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"fear"},
 {"id":"ytc_UgzlEYEuPjH6G9ATpcl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"}]
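To spot-check a coded comment against the raw model output, the response can be parsed as JSON and indexed by comment id. A minimal sketch in Python (the `raw` string embeds two entries copied from the response above for brevity; the parsing and lookup approach is illustrative, not part of the original pipeline):

```python
import json

# Raw LLM response: a JSON array of per-comment codings.
# Two entries copied from the full response above.
raw = '''[
  {"id": "ytc_UgyACA_tP1esiLUe6mx4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugy9tdBbB9hrFjKYKmV4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]'''

# Index codings by comment id so one comment's coding is easy to pull up.
codings = {entry["id"]: entry for entry in json.loads(raw)}

# Look up the coding for the comment shown on this page.
coding = codings["ytc_UgyACA_tP1esiLUe6mx4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # distributed outrage
```

This mirrors the "Coding Result" table above: the stored dimensions (responsibility, reasoning, policy, emotion) should match the entry with the same id in the raw response.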