Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Wow, what a completely dishonest hit piece. Like can't even represent what his "expert" said about failure rates. 3 out of every 100 situations it gets wrong then he jumps to saying 3 out of every 100 FSD ends in a crash with his airplane comparison. That's a huge jump and not even remotely what she said. Also, that was about as vague a statement as she could make. What situations? What does it get wrong? How does that compare to human drivers? Basically it could just make a wrong turn and that would be a wrong decision. Then if we compare to humans, is 97% an improvement over human drivers? The other thing his "expert" doesn't address is the need for decision making or why humans are good enough to drive without lidar and radar implanted. Waymo can have all the inputs on the planet but if it's not smart enough to make real world decisions about that data then all those inputs are useless. What a waste of time to watch and to respond. I just hope someone reads this and sees through the BS in this video. Not saying FSD is perfect, but if it's better than humans then it's an improvement and a step in the right direction.
youtube 2026-04-03T12:2…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       deontological
Policy          none
Emotion         outrage
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugw6CamZip73uj3w1FZ4AaABAg", "responsibility": "unclear",    "reasoning": "unclear",          "policy": "unclear",   "emotion": "approval"},
  {"id": "ytc_UgxWWUC1jIbu6-ggbet4AaABAg", "responsibility": "company",    "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugwslqgi7fU2vtW1cXV4AaABAg", "responsibility": "ai_itself",  "reasoning": "consequentialist", "policy": "unclear",   "emotion": "fear"},
  {"id": "ytc_UgwzXu-IbqOx8EnyA1d4AaABAg", "responsibility": "none",       "reasoning": "unclear",          "policy": "ban",       "emotion": "fear"},
  {"id": "ytc_UgxItVi0wv8yipQpyBd4AaABAg", "responsibility": "unclear",    "reasoning": "mixed",            "policy": "unclear",   "emotion": "indifference"},
  {"id": "ytc_UgwLMa-o_ufE2RlCPBt4AaABAg", "responsibility": "none",       "reasoning": "deontological",    "policy": "none",      "emotion": "outrage"},
  {"id": "ytc_Ugw2xb40ncFvrgaPD6p4AaABAg", "responsibility": "user",       "reasoning": "consequentialist", "policy": "none",      "emotion": "outrage"},
  {"id": "ytc_UgylL4hXJYdqnyKeeg54AaABAg", "responsibility": "company",    "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzCh89OdYMwf2U5Zex4AaABAg", "responsibility": "developer",  "reasoning": "deontological",    "policy": "unclear",   "emotion": "indifference"},
  {"id": "ytc_UgxCzZaBgd3KbSnpXy14AaABAg", "responsibility": "government", "reasoning": "deontological",    "policy": "regulate",  "emotion": "outrage"}
]
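The raw response is a JSON array with one coding record per comment, keyed by comment `id`. A minimal sketch of inspecting one comment's coding and sanity-checking its labels (assuming only the label values that appear in this response; the full coding schema may allow others):

```python
import json

# Abbreviated raw LLM response in the format shown above
# (two of the ten records, copied verbatim).
raw = '''[
  {"id": "ytc_Ugw6CamZip73uj3w1FZ4AaABAg", "responsibility": "unclear",
   "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgwLMa-o_ufE2RlCPBt4AaABAg", "responsibility": "none",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]'''

# Label sets observed in this response (assumption: the real schema
# may define additional values).
ALLOWED = {
    "responsibility": {"none", "unclear", "company", "developer",
                       "user", "government", "ai_itself"},
    "reasoning": {"unclear", "deontological", "consequentialist", "mixed"},
    "policy": {"none", "unclear", "liability", "ban", "regulate"},
    "emotion": {"approval", "outrage", "fear", "indifference"},
}

def lookup(raw_json: str, comment_id: str) -> dict:
    """Return the coding record for one comment id, validating each dimension."""
    records = {rec["id"]: rec for rec in json.loads(raw_json)}
    rec = records[comment_id]
    for dim, allowed in ALLOWED.items():
        if rec[dim] not in allowed:
            raise ValueError(f"unexpected {dim} value: {rec[dim]!r}")
    return rec

coding = lookup(raw, "ytc_UgwLMa-o_ufE2RlCPBt4AaABAg")
print(coding["emotion"])  # outrage
```

The record for `ytc_UgwLMa-o_ufE2RlCPBt4AaABAg` matches the coding-result table above (responsibility `none`, reasoning `deontological`, policy `none`, emotion `outrage`).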