Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Lets say we agree that autonomous car will be a likely thing happening in near future. Will it be govern by a single authorization body for the whole world? I would say it is not likely for geopolitical and geographical restrictions so I would foresee a combination of algorithm favoring different scenarios as discuss above. Would it be a single solution out of this dilemma? most probably not. Autonomous car companies will more likely have their own version of the algorithm for this and would vary from one to another. So lets give the buyer of the car to decide what and how do they want their car be. Until one day when every single vehicle on earth is fully autonomous, then there will be less likely this scenario will happen, while if this really happened the other autonomous car can also instantly react by giving more way for them to escape the incoming impact.
YouTube AI Harm Incident 2017-01-13T16:3…
Coding Result
Dimension        Value
Responsibility   distributed
Reasoning        mixed
Policy           industry_self
Emotion          mixed
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgiDRHNP6Ll3F3gCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgiIYchWvUGckHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgjiR5ifVgu5L3gCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UgiJaxBMly9MvXgCoAEC","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UghSuhCsL9iAHXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UghAA7dcebmab3gCoAEC","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UghAsDUNhcPf4XgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugi_4HU5JSF7SngCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"mixed"},
  {"id":"ytc_Ugiufh1PTT6cmXgCoAEC","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UghXJhtXibHvFXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
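The raw response above is a JSON array with one object per comment, keyed by YouTube comment id. A minimal sketch of how such a batch response can be parsed back into per-comment codings (the variable names here are illustrative, not from any particular pipeline; the id used in the lookup is the one whose values match this comment's Coding Result):

```python
import json

# A trimmed copy of the batch response shown above; in practice this
# would be the full string returned by the model.
raw_response = '''[
  {"id":"ytc_UgjiR5ifVgu5L3gCoAEC","responsibility":"distributed",
   "reasoning":"mixed","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UghAA7dcebmab3gCoAEC","responsibility":"company",
   "reasoning":"deontological","policy":"ban","emotion":"fear"}
]'''

# Parse the array and index the records by comment id.
records = json.loads(raw_response)
by_id = {r["id"]: r for r in records}

# Look up the coding for the comment inspected on this page.
coding = by_id["ytc_UgjiR5ifVgu5L3gCoAEC"]
print(coding["responsibility"], coding["policy"])  # distributed industry_self
```

This makes the correspondence explicit: the Coding Result table for this comment (distributed / mixed / industry_self / mixed) is simply the record in the batch response whose `id` matches the comment's id.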