Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
is that why we don't understand why AI never functions the way we design it to and how it draws its conclusions and literally nobody can explain why? black box issue is a very real issue.
youtube 2025-09-22T09:2…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          liability
Emotion         fear
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytr_Ugye_ZlTo0NCSwjLlCZ4AaABAg.AIkKNpSNXSvAIkXGuLZYpF", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgxHhE74iA0bIcQ4S0N4AaABAg.AIkJoZoMVglAIkTLJwiiBL", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgzDDhNJ-GSG_waxxlV4AaABAg.AQ_Cbvi2u_hAQjGBh7s5lq", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytr_UgylY7thrOkjaWaPKhZ4AaABAg.AM2ga0vqXmgANN3LpA6sqm", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytr_UgxXrr3yjcJxKu2Cqql4AaABAg.AKdVBijpuxlANN3eCbLxBh", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytr_UgxXrr3yjcJxKu2Cqql4AaABAg.AKdVBijpuxlANNL77e6Z2M", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgyiE5k1_KWwgVzDMIh4AaABAg.AJKHpc1PN_ZANy3HBuBqlZ", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytr_UgyiE5k1_KWwgVzDMIh4AaABAg.AJKHpc1PN_ZAPYOT2Xhw16", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
  {"id": "ytr_UgycihpGSg5n2xKP2Ot4AaABAg.9zZfqnRBG3GAH1mKPKuZLD", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_Ugxpt6JIq7o5_-oeWUx4AaABAg.9zS8bnhfgwZA2P0I17Fkf2", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
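The raw response is a JSON array with one object per comment, each carrying the four coding dimensions. A minimal sketch of how it could be parsed and indexed by comment id; the function name `codes_by_id` and the `"none"` fallback for missing dimensions are assumptions, not part of the tool itself:

```python
import json

# A shortened stand-in for the raw LLM response above (one record kept for brevity).
raw = '''[
  {"id": "ytr_UgylY7thrOkjaWaPKhZ4AaABAg.AM2ga0vqXmgANN3LpA6sqm",
   "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "liability", "emotion": "fear"}
]'''

# The four coding dimensions shown in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def codes_by_id(raw_response: str) -> dict[str, dict[str, str]]:
    """Parse the model output and index each record's codes by comment id.

    Hypothetical helper: falls back to "none" if the model omits a dimension.
    """
    records = json.loads(raw_response)
    return {
        rec["id"]: {dim: rec.get(dim, "none") for dim in DIMENSIONS}
        for rec in records
    }

coded = codes_by_id(raw)
print(coded["ytr_UgylY7thrOkjaWaPKhZ4AaABAg.AM2ga0vqXmgANN3LpA6sqm"])
# {'responsibility': 'developer', 'reasoning': 'consequentialist', 'policy': 'liability', 'emotion': 'fear'}
```

The lookup for the id above matches the coding result shown for this comment (developer / consequentialist / liability / fear).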