Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@cemdursunof COURSE I’m not a scientist. I’m commenting in YouTube comments section, like you. And I’m not sure of myself, but this, yes. There is no other outcome. Not necessarily in ten years, but ultimately, a human error, an oversight, a highly developed AI in the wrong hands decades into the future, unrestricted evolution of a sentient AI, the merging of multiple AI to achieve goals more efficiently and unintended side effects allowing for developments that are unable to be controlled by humans; there are untold scenarios which could see THIS scenario play out, and all have to be defended against at once, constantly, and by generations of humans. Fallible humans. There are some things where being a scientist has no bearing on whether you can talk in absolutes. This will happen. Extrapolating even from a layman’s understanding is more than enough to see that.
youtube · AI Governance · 2025-08-03T12:0…
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       consequentialist
Policy          unclear
Emotion         fear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytr_UgwQ6u7ehCzgYH1LjVp4AaABAg.ALKr7tTmN3CALMaL9CNlQx", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgwQ6u7ehCzgYH1LjVp4AaABAg.ALKr7tTmN3CALMbSKFIRbh", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytr_UgyH8K09Rcdpd5h-enJ4AaABAg.ALKWI1FMaSbALPOJudO8rx", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_Ugyw6HTa6ipw2V_Mf_d4AaABAg.ALKUZ1PVK5AALNsSOydXwI", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytr_Ugz2w9uW5_dH54Iy_654AaABAg.ALKU817QYHVALNDM8erCTx", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytr_Ugz2w9uW5_dH54Iy_654AaABAg.ALKU817QYHVALNLLjwQXX3", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytr_UgwtHZTg6Xtd-4QC05Z4AaABAg.ALKSNHC4IW_ALKSyYVkNV0", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytr_Ugyag-dKEjIRdewgZ214AaABAg.ALKQgzjZ7N1ALKUVxyD5Iq", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytr_Ugxu69rP8vmtKV5WAER4AaABAg.ALKPNwzSJasALgcWR1PYGh", "responsibility": "none", "reasoning": "virtue", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgyK2hhZrQ2ql3FhJsN4AaABAg.ALKMXhUBO5KALLrCx9bcyK", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
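The coding result shown above can be recovered from the raw response by parsing the JSON array and indexing the records by their id field. A minimal Python sketch, assuming the response always parses as a flat array of objects with an "id" key (the id and values below are copied from the dump; the helper name parse_codings is illustrative):

```python
import json

# Excerpt of a raw LLM response, taken from the dump above. A real
# response would contain one record per coded comment.
raw_response = """[
  {"id": "ytr_UgwQ6u7ehCzgYH1LjVp4AaABAg.ALKr7tTmN3CALMbSKFIRbh",
   "responsibility": "distributed", "reasoning": "consequentialist",
   "policy": "unclear", "emotion": "fear"}
]"""

def parse_codings(text: str) -> dict:
    """Parse the raw response and index each coding record by comment id."""
    return {record["id"]: record for record in json.loads(text)}

codings = parse_codings(raw_response)
coding = codings["ytr_UgwQ6u7ehCzgYH1LjVp4AaABAg.ALKr7tTmN3CALMbSKFIRbh"]
# coding now holds the four dimensions rendered in the Coding Result
# table: responsibility, reasoning, policy, and emotion.
```

Indexing by id makes the lookup from a coded comment back to its raw record a constant-time dictionary access, which is convenient when auditing individual codings against the model output.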