Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The big concern I still have about driverless vehicles on our roads is that they're an open system: anything can enter or leave unpredictably. I've studied computer programming and know how impossible it is to predict every situation a machine might face and program in a suitable response. That's why the most advanced autopilot systems in the world, those that exist for airplanes, will automatically cut to manual flight whenever anything unusual occurs. When the unusual happens, it is by definition something the self-driving tech programmers and designers couldn't predict or preprogram responses for, so you need a human intelligence capable of rapid problem solving, creating and attempting unique solutions to unique problems. Our roads are less predictable than an airplane in the sky, and emergencies develop much faster than they do in the air, where pilots typically have a much longer problem-solving window and are also among the most regulated and highly trained professionals in the world. Our streets require much less training, yet have much greater complexity in terms of the array of things that can happen that nobody could ever have predicted.
youtube AI Jobs 2025-06-13T18:4…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          regulate
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgxlZv2IzRrlNAtlCwt4AaABAg", "responsibility": "company",   "reasoning": "deontological",    "policy": "regulate",  "emotion": "outrage"},
  {"id": "ytc_UgwMO_KE5x2G8Fk1XZR4AaABAg", "responsibility": "company",   "reasoning": "consequentialist", "policy": "liability", "emotion": "indifference"},
  {"id": "ytc_Ugw_v7KGriE-MRnT2l94AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none",      "emotion": "fear"},
  {"id": "ytc_UgwUzbHrlH7K4DxD8HJ4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none",      "emotion": "mixed"},
  {"id": "ytc_Ugz2P0XapfXX4tH-r_l4AaABAg", "responsibility": "company",   "reasoning": "virtue",           "policy": "none",      "emotion": "outrage"},
  {"id": "ytc_UgxxdQG7dfrUxnqm1dh4AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",      "emotion": "resignation"},
  {"id": "ytc_UgxZ2shTIQnU5upRNLR4AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",      "emotion": "approval"},
  {"id": "ytc_Ugx_eFA5UKbYK8C-ahJ4AaABAg", "responsibility": "company",   "reasoning": "deontological",    "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugznfq_qScpi4P7oItp4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none",      "emotion": "fear"},
  {"id": "ytc_Ugwp9XhcZvWcG1U38aR4AaABAg", "responsibility": "none",      "reasoning": "regulate" and "consequentialist", "policy": "regulate",  "emotion": "fear"}
]
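Since the model returns one JSON array covering the whole batch, each comment's codes can be recovered by parsing the array and indexing on `id`. The sketch below is a minimal illustration, not part of the coding pipeline itself; it reuses three entries from the response above (the real response contains ten) and assumes the output is valid JSON with the five fields shown.

```python
import json
from collections import Counter

# Raw batch response as returned by the model (truncated to three of the
# ten entries purely for illustration).
raw = '''[
 {"id":"ytc_UgxlZv2IzRrlNAtlCwt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_Ugznfq_qScpi4P7oItp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_Ugwp9XhcZvWcG1U38aR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]'''

codes = json.loads(raw)

# Index by comment id so a single comment's codes can be looked up directly,
# which is how the "Coding Result" table above is populated.
by_id = {row["id"]: row for row in codes}
print(by_id["ytc_Ugwp9XhcZvWcG1U38aR4AaABAg"]["emotion"])  # fear

# Tally one dimension across the batch, e.g. for summary statistics.
emotions = Counter(row["emotion"] for row in codes)
print(emotions["fear"])  # 2
```

In practice the raw string may carry markdown fences or stray text around the array, so a production parser would strip those before calling `json.loads`.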