Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
You said it falls for the exact same bias we fall for. Humans struggle with these questions too. AI's are not infallible. But neither are we.
youtube 2026-02-17T23:5…
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytr_UgwMv4724Wovc175N7t4AaABAg.9mBDapI7Mt-9mCe8Ne6PbQ", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "indifference"},
  {"id": "ytr_UgzcEmww--hkmsTbWoF4AaABAg.9m9lmDuW82m9mTmQxgqcuy", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytr_UgzcEmww--hkmsTbWoF4AaABAg.9m9lmDuW82m9me1xZqMowY", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgxDk5iqrcaoSKIyvnN4AaABAg.9m9QrUxUux49mKwq9FFndr", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytr_UgzpbIjYxAifI97_xBN4AaABAg.ABV8iO6l2WZABVEmJrqfOW", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgzpbIjYxAifI97_xBN4AaABAg.ABV8iO6l2WZABVQN3Wi5NZ", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgxPLErCIEKUGChrmu94AaABAg.AOfqlECmNjeATLhin0DA3g", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgyRq9_MI1dhs6DYvip4AaABAg.AMiqUm4HWvDARBEwrhFiBt", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgyRq9_MI1dhs6DYvip4AaABAg.AMiqUm4HWvDARBN1VmWMfo", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_Ugz7ToWhoSMv6kRE9194AaABAg.AJa0elDhqFHAJa2GU1gKZ2", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "indifference"}
]
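A raw response like the one above can be parsed and sanity-checked before its codings are stored. The sketch below is illustrative, not the tool's actual pipeline: the allowed value sets are inferred only from the records shown here, and the real codebook may define additional values.

```python
import json

# Dimension vocabularies inferred from the responses shown above
# (an assumption for illustration; the real codebook may allow more values).
ALLOWED = {
    "responsibility": {"company", "user", "distributed", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "none"},
    "emotion": {"indifference", "outrage", "resignation", "approval", "fear"},
}

def parse_raw_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed coding records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Comment ids in this dataset carry a "ytr_" prefix.
        if not rec.get("id", "").startswith("ytr_"):
            continue
        # Every dimension must be present and drawn from its vocabulary.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id": "ytr_example", "responsibility": "company", '
       '"reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}]')
print(len(parse_raw_response(raw)))  # → 1
```

Validating against a closed vocabulary catches the common failure mode where the model invents an off-codebook label; such records can then be flagged for recoding instead of silently entering the results.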