Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
what a silly analogy. a hundred years past between lamp lighting and today. t…
ytc_UgxpUCPRJ…
If you assume there is no god or other supernatural force at work, our intellige…
ytc_Ugz94X0G_…
AI is nothing more then a glorified autocomplete. AGI was marketed to dupe inves…
ytc_UgxDdRfQ9…
Would all of those pieces come into being if it was not for ponpon's original X …
ytc_UgzRoCzkY…
Yay. I'm alive. Let's go. Oh no. I'm touched. I'm gonna die. Oh no I can't die I…
ytc_UgwQVMP35…
@supernova3241 there is no way. That does not logically make any sense. Need to…
ytr_UgykkIsxB…
I’d like to see AI be a diesel mechanic , crawling on the ground fixing a broken…
ytc_UgzhBO0zj…
As, i think, an average person, I didnt know that Tesla wasnt all self-driving. …
ytc_UgyR1EF9k…
Comment
AI poses significant risks, including the potential for extinction. Many experts believe that if superintelligent AI is not controlled properly, it could act against human interests. The idea of pressing a button to stop AI development forever raises important questions about safety and progress.
If you could choose to pause AI development for safety, what changes would you want to see in its design before resuming?
youtube
AI Governance
2025-12-24T15:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytr_UgwUkTAXOZPA8zdSNeN4AaABAg.AR3E9o5Bfz2AR75f1aEknf","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_UgwpusWOUjzV3dZDT_94AaABAg.AR32r73eDdlAR76qha_ApI","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytr_UgyPDiehXZmGi55pqYh4AaABAg.AR2rF52sXH9AR7BYfMejNX","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytr_Ugwc8wlD0sRCMIcFXtN4AaABAg.AR2ievh5RK-AR7CDAnU4i4","responsibility":"none","reasoning":"contractualist","policy":"regulate","emotion":"indifference"},
  {"id":"ytr_Ugye1Q2WCOT6RmIz0mF4AaABAg.AR2S4WWFTy4AR7DG26zYNk","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgwnJ4xRkD42hctGLxJ4AaABAg.AR2Lp5gKeF2AR7EX0zuZQw","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytr_Ugylf2CAsV4PpR2_0Ax4AaABAg.AR0v7JqZCs3AR0vHJX5ghV","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytr_UgwaajqJ1-Ld33dVM1x4AaABAg.AR-T16d5nsnAR3tllVAtWs","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzQcREkOAjKi5j7J8p4AaABAg.AQu7YNkn_uhARZEcFKn-0W","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_UgzhH7Di9HiTjxIDlv14AaABAg.AQnNBx8n4jhAQoZlvzCtoL","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"}
]
```
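The raw LLM response is a JSON array of per-comment codes keyed by comment ID, so looking up the coding for a given comment amounts to building an index over that array. A minimal sketch (the two entries are copied from the response above; the variable names are illustrative, not from any real tooling):

```python
import json

# Raw LLM response: a JSON array of per-comment codes,
# truncated here to two entries for brevity.
raw_response = """
[
  {"id":"ytr_UgwUkTAXOZPA8zdSNeN4AaABAg.AR3E9o5Bfz2AR75f1aEknf","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_UgwpusWOUjzV3dZDT_94AaABAg.AR32r73eDdlAR76qha_ApI","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"}
]
"""

# Index the coded rows by comment ID for constant-time lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

# Look up one comment's coding across all four dimensions.
row = codes_by_id["ytr_UgwpusWOUjzV3dZDT_94AaABAg.AR32r73eDdlAR76qha_ApI"]
print(row["responsibility"], row["reasoning"], row["policy"], row["emotion"])
```

This mirrors the "Look up by comment ID" flow above: one parse of the response, then dictionary access per comment.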