Raw LLM Responses

Inspect the exact model output behind each coded comment.

Comment
Issue: If I have a self-driving car, then it better prioritise my survival over everyone else, because I am the passenger and I own the vehicle. The problem is that I would rather have the software to prioritise my safety above anyone else, disregarding the wellbeing of others, if required. If the software calculates that it would rather sacrifice my life to save others, then I am not interested in such a terrible product.
Source: youtube, 2023-08-04T14:4…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        deontological
Policy           none
Emotion          indifference
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgzWBHmbGK_KD-YLgqF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwrs67iZ0M6DtHVhJx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxDz1alild_uCkEtX94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzBmLU9l4aPDLCG9W54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
  {"id":"ytc_UgyMmeiKSs3nDpuFDxB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxe0SkbASwCSYMLGMd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwWbKVVEPa_LBJ1XcZ4AaABAg","responsibility":"user","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgzKFHCvf2mLhQDsZP94AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgwKIeS8CV8W7htXuIR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugx_55O1pq-KarsAbtR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
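The raw response is a JSON array with one record per comment id, so coded values can be rejoined to their comments by id; the record for ytc_Ugxe0SkbASwCSYMLGMd4AaABAg is the one rendered in the coding result above. Below is a minimal Python sketch of how such output might be parsed and validated before it populates the table. The allowed label sets are inferred only from the examples on this page (the underlying codebook may permit more values), and parse_llm_response is a hypothetical helper, not part of any published pipeline.

    import json

    # Raw response trimmed to two records so the demo is self-contained;
    # in practice this string would be the full model output shown above.
    raw_response = """[
      {"id":"ytc_Ugxe0SkbASwCSYMLGMd4AaABAg","responsibility":"ai_itself",
       "reasoning":"deontological","policy":"none","emotion":"indifference"},
      {"id":"ytc_UgzWBHmbGK_KD-YLgqF4AaABAg","responsibility":"user",
       "reasoning":"consequentialist","policy":"none","emotion":"indifference"}
    ]"""

    # Label sets inferred from the examples on this page (assumption:
    # the actual codebook may define additional categories).
    ALLOWED = {
        "responsibility": {"user", "company", "ai_itself", "none", "unclear"},
        "reasoning": {"consequentialist", "deontological", "mixed"},
        "policy": {"none", "liability", "industry_self", "regulate", "ban", "unclear"},
        "emotion": {"indifference", "outrage", "fear", "approval", "mixed", "resignation"},
    }

    def parse_llm_response(raw):
        """Parse the model output into a dict keyed by comment id,
        skipping records whose labels fall outside the expected sets."""
        records = {}
        for rec in json.loads(raw):
            valid = "id" in rec and all(
                rec.get(dim) in values for dim, values in ALLOWED.items()
            )
            if valid:
                records[rec["id"]] = rec
        return records

    coded = parse_llm_response(raw_response)
    print(coded["ytc_Ugxe0SkbASwCSYMLGMd4AaABAg"])
    # -> {'id': 'ytc_Ugxe0SkbASwCSYMLGMd4AaABAg', 'responsibility': 'ai_itself',
    #     'reasoning': 'deontological', 'policy': 'none', 'emotion': 'indifference'}

Filtering on fixed label sets is one simple way to catch malformed or hallucinated categories in model output before they reach the coded table; records that fail validation could instead be flagged for manual review rather than dropped.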