Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
ASI is like fusion: for twenty years it has been "coming soon", except that what we currently have would've looked like ASI to anyone from 10 years ago. Basically, if I had showcased the capabilities of the current SOTA models to computer and data scientists back in 2015 and asked them when they thought we'd reach this level of sophistication, the vast majority would definitely not have answered "2025". Now, I fully believe that LLMs are not capable of thinking, not because they hallucinate often, or even answer prompts paradoxically within the same paragraph, but for the simple reason that they are unable to correctly explain their chain of thought, despite having no problem communicating with us otherwise (they are literally called language models). My point is that even 5 years ago, the current AIs that we have, as flawed as they are, would've looked like something out of a sci-fi novel to the average expert researching the field, let alone the average Joe, and if there's even a modicum of a chance that this trend will continue, then we're under-hyping LLMs if anything.
youtube AI Governance 2025-08-26T16:0… ♥ 10
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytr_Ugw5zT_gAeU4284OqNd4AaABAg.AMI8apbH4xyAMIBq6y29jc", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytr_Ugw5zT_gAeU4284OqNd4AaABAg.AMI8apbH4xyAMIIB98m_5e", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_Ugw5zT_gAeU4284OqNd4AaABAg.AMI8apbH4xyAMIJ2j3vBnX", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytr_UgzzWcKN6Ll5KO5WKJd4AaABAg.AMI8Ob4O-EaAMNZXkFU10B", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytr_UgzfbP0K-G9VWxIogXd4AaABAg.AMI83iKsIOoAMI8iUpE4i4", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_UgzfbP0K-G9VWxIogXd4AaABAg.AMI83iKsIOoAMID-_Tn-e0", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytr_UgzfbP0K-G9VWxIogXd4AaABAg.AMI83iKsIOoAMIFlmV9uPT", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgxdRRrnpshkRrHDFUN4AaABAg.AMI7uX0AP0wAMILJTIvh28", "responsibility": "company", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgxdRRrnpshkRrHDFUN4AaABAg.AMI7uX0AP0wAMINkTUVeGl", "responsibility": "company", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgxPHUyCzebXu4oK_Q94AaABAg.AMI7iXadpOtAMI9S-XYTDl", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
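The raw response above is a JSON array of one record per comment, each coded on four dimensions (responsibility, reasoning, policy, emotion). A minimal Python sketch of how such a response could be parsed and tallied — assuming the model output is valid JSON and using two records from the array above for brevity; this is illustrative, not the tool's actual implementation:

```python
import json
from collections import Counter

# Two records copied from the raw LLM response above.
raw = '''[
  {"id": "ytr_Ugw5zT_gAeU4284OqNd4AaABAg.AMI8apbH4xyAMIBq6y29jc",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "resignation"},
  {"id": "ytr_UgzzWcKN6Ll5KO5WKJd4AaABAg.AMI8Ob4O-EaAMNZXkFU10B",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "liability", "emotion": "fear"}
]'''

records = json.loads(raw)

# Tally the value distribution of each coding dimension.
for dim in ("responsibility", "reasoning", "policy", "emotion"):
    counts = Counter(r[dim] for r in records)
    print(dim, dict(counts))
```

In practice the parsing step would also need to handle malformed model output (truncated arrays, missing keys), which `json.loads` will reject with a `JSONDecodeError`.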