Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
If we think only on these scenario, the algorithym controlling the vehicle will be in serious trouble, but we need to see these on differents perspectives. I think we need to understand that we are not perfect and we can not create a perfect algorithym or a sum of algorithyms that prevent all scenarios. I think we should work in various aspects such as driving lesson, practical test to get drivers licenses and laws to produce vehicles with safety measures. The statistics among many countries show which are the primary causes of car accidents, these can help to programm the algorithyms by focussing on those. I also think that every live matters but one of the reasons of car accidents is the huge number of vehicles on the streets, what i'm saying is that we are yet far from having a year without car accidents with or without self-driving cars.
youtube AI Harm Incident 2023-11-25T07:1… ♥ 1
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       mixed
Policy          industry_self
Emotion         resignation
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[{"id":"ytc_UgzyoQHfkvKymBmesal4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgxjDv4Z1CBjO3WJHB94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}, {"id":"ytc_UgyTShSLnJ9cwL-Lbw54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}, {"id":"ytc_Ugw2KRypsW4jIcRnBj14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"}, {"id":"ytc_Ugw5i-H8AgNVVhk3L5Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_UgyLQD8OSQSPK_gIU7Z4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}, {"id":"ytc_UgxcoaaNEw2BPDxcR8B4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"industry_self","emotion":"resignation"}, {"id":"ytc_Ugy1OKJlc-JkmZ4FchN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}, {"id":"ytc_UgzCOQgM8jWOa4MaGkB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}, {"id":"ytc_Ugxcp38aM6btPES_LwF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}]