Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
People keep saying "oh the driver should have known" or a "better driver would have prevented that", but that's exactly the point. You can't just expect everyone to be able to react perfectly at all times - exactly why these incidents happen. That's part of human error and what AI is there to improve.
youtube 2021-12-28T13:5…
Coding Result
Dimension: Value
Responsibility: ai_itself
Reasoning: consequentialist
Policy: none
Emotion: approval
Coded at: 2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgzqzqZCuaxc2Xi3Rhd4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugw7VkeZfk6iMyLHMdx4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzYyo3j8GDhqvKlEVR4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxexkQg6tZdTYpmKgd4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxC0uBDLUrtMBMyFZx4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugw_CFZRHmgDJIkCXKR4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugzix9MI7_81v0ODBxh4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzEuAJQ1XsUCCnexfp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxdZWdMNE5pYggbp3x4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxN98Cl05jycq5Zf1l4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"}
]
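Inspecting a coded comment means finding its record by `id` inside a raw response array like the one above. A minimal Python sketch of that lookup (the helper name `lookup_coding` and the trimmed one-record sample are assumptions for illustration, not part of the tool itself):

```python
import json

# Assumed sample: one record in the same shape as the raw LLM response above.
raw_response = """[
  {"id": "ytc_Ugw_CFZRHmgDJIkCXKR4AaABAg",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "none", "emotion": "approval"}
]"""

def lookup_coding(raw, comment_id):
    """Return the coding record matching comment_id, or None if absent."""
    for record in json.loads(raw):
        if record.get("id") == comment_id:
            return record
    return None

coding = lookup_coding(raw_response, "ytc_Ugw_CFZRHmgDJIkCXKR4AaABAg")
print(coding["responsibility"], coding["emotion"])  # ai_itself approval
```

The record the helper returns carries the same four dimensions shown in the Coding Result table, so the two views can be cross-checked directly.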