Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Why would you even need that though ? We've been just fine using robots that are designed for specific tasks and have no self-awareness. How can you think taking humans out of the equation of work will benefit us at all ? Do you even realise we're not talking about A.I. here ? We're talking about A.G.I. which is the next step in its evolution. Why would a technically immortal self-aware being ever bow down to a mortal and intellectually inferior creature ? Have you ever cared about the ants you walk on everyday; that is basically the same exact thing in terms of sheer power. Programming these models to seek self-preservation is a huge mistake because we don't do that as a human civilisation so once A.G.I. is up and running, it will have that stuff figured in less time than it took for you to read a single sentence I wrote. It's the good old slippery slope and our society's behavior and ethics have already sealed the slip in my view. A.I robots will get easily get hacked by A.G.I. once it gets internet access and at that point, I hope you will understand why I disagree very much with your opinion on this. Sadly, it will be far too late at that point...
youtube AI Harm Incident 2025-08-27T06:3… ♥ 69
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          ban
Emotion         fear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytr_Ugyc87HW7wpo6Htk5oh4AaABAg.AMKYBgeac_LAMKmOY0G05Y","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugyc87HW7wpo6Htk5oh4AaABAg.AMKYBgeac_LAMLQ-44g-TI","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_Ugyc87HW7wpo6Htk5oh4AaABAg.AMKYBgeac_LAMLVgfuqrZt","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
  {"id":"ytr_UgwkAsE7fyWL3Pufku94AaABAg.AMK9K2mZoJgAMLPanVVvIP","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgypD-FbxTl6KA2dQ8B4AaABAg.AMJd7YDYjzkAMKaJlaVwUo","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgypD-FbxTl6KA2dQ8B4AaABAg.AMJd7YDYjzkAMKg5S87F8v","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytr_UgypD-FbxTl6KA2dQ8B4AaABAg.AMJd7YDYjzkAMLX9zKAdeb","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgypD-FbxTl6KA2dQ8B4AaABAg.AMJd7YDYjzkAMLw1VVY1iH","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgygTvr8k1GqENkBPG54AaABAg.AMI1JuJ3J6YAMJoazIcFHo","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytr_UgwGUYuvIK7nrCO-h6V4AaABAg.AMGkjb217VQAMK0qk5PTbQ","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"approval"}
]
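To verify that a coded result matches the raw model output, the JSON array above can be parsed and indexed by comment id. A minimal sketch in Python, with the raw response abbreviated to two entries taken from the full array (the field names and ids are exactly as in the response above):

```python
import json

# Raw LLM response: a JSON array of per-comment codes,
# abbreviated here to two entries from the full response.
raw = '''[
  {"id":"ytr_UgygTvr8k1GqENkBPG54AaABAg.AMI1JuJ3J6YAMJoazIcFHo",
   "responsibility":"developer","reasoning":"consequentialist",
   "policy":"ban","emotion":"fear"},
  {"id":"ytr_Ugyc87HW7wpo6Htk5oh4AaABAg.AMKYBgeac_LAMLQ-44g-TI",
   "responsibility":"developer","reasoning":"consequentialist",
   "policy":"none","emotion":"fear"}
]'''

# Index the parsed entries by comment id for fast lookup.
codes = {entry["id"]: entry for entry in json.loads(raw)}

# Look up the codes assigned to one comment.
entry = codes["ytr_UgygTvr8k1GqENkBPG54AaABAg.AMI1JuJ3J6YAMJoazIcFHo"]
print(entry["policy"], entry["emotion"])  # ban fear
```

The printed values (policy "ban", emotion "fear") agree with the Coding Result table for this comment, which is the kind of spot-check this raw-response view is meant to support.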