Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
People still inhabit the vehicle. Why are they not following the standards of autonomy by always being prepared to take control? Several tens of seconds of reaction time as stated in the video, should be well enough for any capable human being to correct. AI is an assistant at this time, not the control. TY for video
Source: YouTube, "AI Harm Incident", 2022-09-03T16:0…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  user
Reasoning       deontological
Policy          none
Emotion         resignation
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ugzscw7uPGAor99i2C14AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugy0E59TWqsqDEriYUp4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyzfVpiqfHF7QbAMLp4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxJpWqtSLFqMZq8PNJ4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugx_fxscvu0VyqZfv3N4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwUjVhQHb-8evTYaNp4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgwtIKhESZllyF6TfTl4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "indifference"},
  {"id": "ytc_Ugy5fJqk1enVhxdL5Bx4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugxv_xUEwGN5xBAKjm14AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwMdsiNYJZxs8BHXs14AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
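To cross-check a coded result against the raw model output, the JSON array above can be parsed and indexed by comment id. A minimal sketch, assuming the field names shown in the raw response (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the `index_by_id` helper and the truncated two-record payload are illustrative, not part of any actual pipeline:

```python
import json

# Illustrative excerpt of a raw batch response (two records from the
# array above; field names follow the raw LLM output shown).
RAW_RESPONSE = """[
  {"id": "ytc_Ugy5fJqk1enVhxdL5Bx4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugzscw7uPGAor99i2C14AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"}
]"""

def index_by_id(raw: str) -> dict:
    """Parse the raw JSON array and map each comment id to its coded record."""
    return {record["id"]: record for record in json.loads(raw)}

records = index_by_id(RAW_RESPONSE)
# Look up the record for the comment coded above and verify it matches
# the stored coding result (emotion: resignation, policy: none).
print(records["ytc_Ugy5fJqk1enVhxdL5Bx4AaABAg"]["emotion"])  # resignation
```

Indexing by id makes mismatches between the stored coding result and the raw response easy to spot in bulk.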