Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It's because people like to think they will be just fine if they don't drive drunk, but there's nothing they can do about software bugs. Even if, in reality, even a sober driver is statistically safer in a self-driving car, it won't feel that way until the difference is really big.
youtube 2023-08-10T13:5…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgxAcV3-jeGRD8Ee6zx4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwnChjmSX_yHITtIzR4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwfQrExD2D6_UQHyEp4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugykx2wAY5dREF-SFFJ4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxVTuSUCocmKajIjtN4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugwx1y44FZI776ewm9t4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "indifference"},
  {"id": "ytc_UgwXrz0slgBdwc2zw1l4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyFCnmOfLiYdmU-wYF4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwaemTH8eWUccvEhjt4AaABAg", "responsibility": "ai_itself", "reasoning": "contractualist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugyt4Mx2dB7uiJ5TBdV4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"}
]
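To inspect the exact model output for a given coded comment, the raw response can be parsed as a JSON array and indexed by comment id. A minimal sketch, assuming only the record shape shown above (the `raw` sample below repeats two records from this response for illustration):

```python
import json

# Two records copied from the raw LLM response above, in the same shape:
# one JSON object per coded comment.
raw = """[
  {"id": "ytc_UgwnChjmSX_yHITtIzR4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwfQrExD2D6_UQHyEp4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "ban", "emotion": "fear"}
]"""

records = json.loads(raw)

# Index by comment id so any coded comment can be looked up directly.
by_id = {record["id"]: record for record in records}

coded = by_id["ytc_UgwnChjmSX_yHITtIzR4AaABAg"]
print(coded["emotion"])  # -> indifference
```

This is the lookup behind the "Coding Result" panel: the comment shown above maps to the second record in the raw array, whose dimension values match the table.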