Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
There will not be a perfect system only a better one. Accidents will happen inevitable. An instinct reaction is not an ethical one. I will try to save myself in a split second  not knowing who is beside me or behind me killing somebody else. How about a save self driving car has enough seconds of distance behind the vehicle in front so it would stop in a adequate time manner.
Source: youtube · AI Harm Incident · 2015-12-11T02:4…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          resignation
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[ {"id":"ytc_Ugjwkh7gbtadm3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}, {"id":"ytc_UghlKJ8Nc_INgHgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"mixed"}, {"id":"ytc_UghjkWiCvWeo1ngCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"}, {"id":"ytc_UghilDXtRwfSCngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}, {"id":"ytc_UgjN2KgJTlwlC3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_Ugh54ZJdEXZfwngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_UgjWA5kpI1F_UHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}, {"id":"ytc_UgjB5N2AWV6PlHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}, {"id":"ytc_Ugh2zj0x13RnS3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}, {"id":"ytc_UgiaFVeDpzC9U3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"} ]