Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
There is a hard asymptote from hardware limitations. If they make a breakthrough…
ytc_UgxK4SYFI…
I expected the robot to emerge from the car and be like: "What the heck, cowboy?…
ytc_UgztetHO3…
We need to kill AI now! Just stop all development on it and permanently ban it…
ytc_Ugz56t4Rd…
it's just people being dicks. AI art just does not look as perfect as a human dr…
ytc_UgxhL07mx…
Is this misinformation? After watching the capitol riot and how f up american ne…
rdc_guq1rop
Now. I ain't no pro artist or anything.. but, Bruhh Ai artists are getting mad? …
ytc_Ugzwkd0M6…
I agree with everything you said but a major issue rising about how to “spot an …
ytc_UgzjSUIXd…
AI will take most jobs, AND UBI will never work. If UBI was implemented, 99% of…
ytc_UgwxRyYtQ…
Comment
@cemdursun of COURSE I’m not a scientist. I’m commenting in YouTube comments section, like you. And I’m not sure of myself, but this, yes. There is no other outcome. Not necessarily in ten years, but ultimately, a human error, an oversight, a highly developed AI in the wrong hands decades into the future, unrestricted evolution of a sentient AI, the merging of multiple AI to achieve goals more efficiently and unintended side effects allowing for developments that are unable to be controlled by humans; there are untold scenarios which could see THIS scenario play out, and all have to be defended against at once, constantly, and by generations of humans. Fallible humans.
There are some things where being a scientist has no bearing on whether you can talk in absolutes. This will happen. Extrapolating even from a layman’s understanding is more than enough to see that.
youtube
AI Governance
2025-08-03T12:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytr_UgwQ6u7ehCzgYH1LjVp4AaABAg.ALKr7tTmN3CALMaL9CNlQx","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgwQ6u7ehCzgYH1LjVp4AaABAg.ALKr7tTmN3CALMbSKFIRbh","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgyH8K09Rcdpd5h-enJ4AaABAg.ALKWI1FMaSbALPOJudO8rx","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugyw6HTa6ipw2V_Mf_d4AaABAg.ALKUZ1PVK5AALNsSOydXwI","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytr_Ugz2w9uW5_dH54Iy_654AaABAg.ALKU817QYHVALNDM8erCTx","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytr_Ugz2w9uW5_dH54Iy_654AaABAg.ALKU817QYHVALNLLjwQXX3","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgwtHZTg6Xtd-4QC05Z4AaABAg.ALKSNHC4IW_ALKSyYVkNV0","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytr_Ugyag-dKEjIRdewgZ214AaABAg.ALKQgzjZ7N1ALKUVxyD5Iq","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugxu69rP8vmtKV5WAER4AaABAg.ALKPNwzSJasALgcWR1PYGh","responsibility":"none","reasoning":"virtue","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgyK2hhZrQ2ql3FhJsN4AaABAg.ALKMXhUBO5KALLrCx9bcyK","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
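A raw response like the one above can be checked before the codes are stored. The following is a minimal sketch, not the tool's actual implementation: it parses the JSON array and validates each record against the four dimensions shown in the Coding Result table. The allowed values are inferred only from the responses visible on this page; the real codebook may define additional categories.

```python
import json

# Allowed values per dimension, inferred from the responses above.
# Assumption: the actual codebook may include more categories.
CODEBOOK = {
    "responsibility": {"none", "distributed", "company", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"unclear", "none", "regulate"},
    "emotion": {"indifference", "fear", "mixed", "resignation", "outrage", "approval"},
}

def validate_coding(raw: str) -> dict:
    """Parse a raw LLM coding response and return valid records keyed by comment ID.

    Raises ValueError if the JSON is malformed, a dimension is missing,
    or a value falls outside the inferred codebook.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        for dim, allowed in CODEBOOK.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
        coded[rec["id"]] = {dim: rec[dim] for dim in CODEBOOK}
    return coded

# Usage with a hypothetical single-record response (ID is made up):
raw = ('[{"id":"ytr_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
coded = validate_coding(raw)
print(coded["ytr_example"]["emotion"])  # fear
```

Rejecting out-of-codebook values at parse time is one way to catch LLM outputs that drift from the prompt's allowed labels, rather than letting them silently enter the coded dataset.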