Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I don't think it's as simple as that. If someone falls (or throws themselves) in front of a car, a bus or a train, do they have the right of way? Not saying this is it, but it's awfully close. There's no point being afraid of obviously a far superior solution than letting us humans drive, simply because you can't (yet) solve some of the edge cases, which aren't really solved even without this tech solution. And not to mention that if left to choose between autonomous driving and humans driving, choosing humans would mean choosing more deaths, for certain. Also, with every passing moment, this tech will get better and better, and when all cars will be autonomous, I think the likelihood of someone getting killed will be lowest, but people will still die on streets, unfortunately. How can't people see this, I'd say is beyond me, but then, I remember how even though the airplanes are the safest transportation method today, you'll see nervous individuals smoke cigarettes before flight, fearing of the crash, when the likelihood of that happening is infinitely smaller than the certainty of all those cigarettes not only shortening, but making their life worse.
youtube 2018-03-21T00:3… ♥ 6
Coding Result
Dimension       Value
Responsibility  none
Reasoning       mixed
Policy          none
Emotion         resignation
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytr_Ugw-wvkKZc9KIwiNtj14AaABAg.8e1mUHM8HBe8e2Dld8YSR8","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxNSoZT7mB4mHEBUfJ4AaABAg.8e1gkiWiWJD8e1vaOKA7jw","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytr_Ugz5dBIVBv9sS55rQt54AaABAg.8e1cQlxEJlU8e1f-gRY4nP","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytr_Ugz5dBIVBv9sS55rQt54AaABAg.8e1cQlxEJlU8e1jRVBNEtQ","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_Ugz5dBIVBv9sS55rQt54AaABAg.8e1cQlxEJlU8e1nOKZl3us","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgyD1b6mLwQFFskEhzd4AaABAg.8e1cBq7EEz88e1nq_0j6Zm","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgyD1b6mLwQFFskEhzd4AaABAg.8e1cBq7EEz88e24J28nNI9","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugxb00GdocSAr288C6J4AaABAg.8e1cAXsVQ5T8e1gK5lR_tJ","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgzpHpTcsqDjgsCglxN4AaABAg.8e1bW4vmhAg8e1tKDxT-C6","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_Ugyyf3r8R8WFb0kWMI54AaABAg.8e1bTumnqMv8e4GT2q8K9i","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
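Inspecting a raw response like the one above usually means parsing the JSON array and looking up the coding for a single comment id. The following is a minimal sketch of that lookup, assuming only the structure visible in the raw response (an array of objects keyed by `id` with the four coding dimensions); the function name and the shortened ids in the sample data are illustrative, not part of the tool.

```python
import json

# Illustrative sample mirroring the raw response's structure; the ids
# here are shortened placeholders, not real comment ids.
raw_response = """[
  {"id": "ytr_sample_1", "responsibility": "none", "reasoning": "mixed",
   "policy": "none", "emotion": "resignation"},
  {"id": "ytr_sample_2", "responsibility": "user", "reasoning": "consequentialist",
   "policy": "none", "emotion": "approval"}
]"""

def coding_for(raw: str, comment_id: str):
    """Return the coding dict for one comment id, or None if absent."""
    codings = json.loads(raw)
    return next((c for c in codings if c["id"] == comment_id), None)

coding = coding_for(raw_response, "ytr_sample_1")
print(coding["emotion"])  # → resignation
```

Because the raw response is the exact model output, `json.loads` will raise `json.JSONDecodeError` if the model emitted malformed JSON, which is itself useful signal when auditing coded comments.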