Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
My bet is 2035 idk why but 2027 just feels too close, also I don’t think LLM’s lead to AGI, or are these companies diverting to different architecture behind closed doors?
Source: YouTube · AI Governance · 2026-01-27T14:1…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       unclear
Policy          unclear
Emotion         indifference
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytr_Ugwz2ivQONX8sa3gLkl4AaABAg.AT4HO84HBY_AT8gPc4lk2H", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_UgzSOXtWS_9DS33CN214AaABAg.ASyvyU9QX2IASztrVIf1Xx", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytr_UgyAs1aPiCfu1j11b814AaABAg.ASGCGJ0gwgZAVDFCpxwICt", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytr_Ugx0lbXeExXBIJm7eDB4AaABAg.ARyBi9G4JueAVDHhQ7VMCy", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytr_UgyOx4qCgvsDbF2EJDh4AaABAg.ARuItSRqukRASFLb-1WeZD", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_UgzinhIp_x1XMCFvnG54AaABAg.ARgT0ISqGr_ARrLvIn1zyC", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_Ugw1y2fBzBE3tOKsKah4AaABAg.ARO4jvFvHk5ASUbDtqw48f", "responsibility": "company", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgyYW4k45r1Y8A-5S5J4AaABAg.ARMthV7V8F2ARNJ385kGmC", "responsibility": "user", "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytr_Ugzoq_28VWSRAlrR0G54AaABAg.ARMmONyRgrNARZZrlb3TmD", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "resignation"},
  {"id": "ytr_UgybYkKrhYMp3uaF0Px4AaABAg.ARMlMUnz7f4ARrLM90QljF", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]
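The raw response above is a JSON array of per-comment coding objects keyed by comment id. A minimal Python sketch of how such a batch could be parsed and validated, indexing codings by id. The allowed-label sets below are assumptions inferred only from the values that appear in this batch; the actual codebook may define additional labels.

```python
import json

# Two entries copied from the raw response above (truncated batch for brevity).
raw_response = """[
  {"id": "ytr_Ugwz2ivQONX8sa3gLkl4AaABAg.AT4HO84HBY_AT8gPc4lk2H",
   "responsibility": "distributed", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_UgzinhIp_x1XMCFvnG54AaABAg.ARgT0ISqGr_ARrLvIn1zyC",
   "responsibility": "none", "reasoning": "unclear",
   "policy": "unclear", "emotion": "indifference"}
]"""

# Label sets observed in this batch (assumption: the real codebook may be larger).
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "resignation", "unclear"},
}

def parse_codings(text: str) -> dict:
    """Parse the model's JSON array and index codings by comment id,
    rejecting any label outside the expected sets."""
    codings = {}
    for row in json.loads(text):
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim}={row.get(dim)!r}")
        codings[row["id"]] = {dim: row[dim] for dim in ALLOWED}
    return codings

codings = parse_codings(raw_response)
print(codings["ytr_UgzinhIp_x1XMCFvnG54AaABAg.ARgT0ISqGr_ARrLvIn1zyC"]["emotion"])
# → indifference
```

Validating against a fixed label set at parse time catches the common failure mode of the model inventing an off-codebook label, before bad codings reach the dashboard.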