Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I stopped at the 33 second mark. that isn't a real danger a self driving car will ever face. believe it or not, we have terrible reaction times to dangers in the road. the car would brake long before any human would, unless you were aware it was going to fall, in which case you would probably have slowed down ahead of time. braking is the only solution. that and a self driving car shouldn't allow itself to be boxed in to begin with. there's no fucking moral dilemma here, if less people die annually to car accidents because people have self driving cars, then it would be immoral to drive yourself instead of allowing the computer to do it.
youtube AI Harm Incident 2015-12-10T07:4…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgiJaQs6F28eWHgCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UggJ82QW9q6Yh3gCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UghnlhSnEQZ0IngCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UggjXP4s7034gngCoAEC", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgjMw5uEv4uP13gCoAEC", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UggU7UUEmbYyYHgCoAEC", "responsibility": "distributed", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgjNuOWAcDkP3HgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgisvA4COAatfngCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgjTfq8djgy0rHgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UghuJ8ET5_X-j3gCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
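A raw response in this shape can be parsed and tallied with a short script. This is a minimal sketch, not the coding pipeline itself: it assumes only the field names visible in the JSON above, and the two-row sample payload here is abbreviated for illustration.

```python
import json
from collections import Counter

# Abbreviated sample in the same shape as the raw LLM response above.
RAW = """[
  {"id": "ytc_UgiJaQs6F28eWHgCoAEC", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UggJ82QW9q6Yh3gCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]"""

# The four coded dimensions, as seen in the response objects.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def tally(raw: str) -> dict:
    """Parse one raw response and count values per dimension."""
    rows = json.loads(raw)
    counts = {dim: Counter() for dim in DIMENSIONS}
    for row in rows:
        for dim in DIMENSIONS:
            # Missing keys are counted explicitly rather than dropped.
            counts[dim][row.get(dim, "missing")] += 1
    return counts

result = tally(RAW)
print(dict(result["reasoning"]))
```

A tally like this makes it easy to spot malformed responses (unexpected or missing values) before the per-comment codes are written back to the table above.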