Raw LLM Responses
Inspect the exact model output for any coded comment; look it up directly by its comment ID, or browse the random samples below.
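A lookup by comment ID is just a scan over the coded records. A minimal sketch, assuming the records are a list of dicts in the raw-response format shown further down this page (the helper name and the `ytc_abc` ID are hypothetical):

```python
def lookup_coded_comment(records, comment_id):
    """Return the coded record matching `comment_id`, or None.

    Each record carries an "id" plus the four coding dimensions
    (responsibility, reasoning, policy, emotion).
    """
    return next((r for r in records if r["id"] == comment_id), None)

# Example with the raw-response shape used on this page:
records = [
    {"id": "ytc_abc", "responsibility": "developer",
     "reasoning": "consequentialist", "policy": "regulate",
     "emotion": "approval"},
]
print(lookup_coded_comment(records, "ytc_abc")["policy"])  # regulate
```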
Random samples:

- "A.I is like every advancement and if “for every action there is an opposite and …" (ytc_Ugx2kgFgG…)
- "Mf if this is you trying to merge my brain with the AI on these terms, I refuse.…" (ytc_UgwmyNtLN…)
- "Can't do ordinary design anymore as those will be replaced by AI, this will elim…" (ytc_UgzYuH4Ke…)
- "Creative work is a continuum. There is no difference in kind between using Photo…" (ytc_UgwpwEAlQ…)
- "All of these men are WEAK, / You’re afraid to confess? You deep fake them / You feel…" (ytc_UgzTkixEG…)
- "Goole and other AI answers, I click thumps up for the wrong answer and thumps do…" (ytc_UgxNVsmfQ…)
- "I dotn care if people use AI what is wrong is not stating its AI.…" (ytc_UgyA3z-xw…)
- "I know how to survive,grow , hunt ECT. But lack money. / A.I. will come to the con…" (ytc_UgwP9sxgh…)
Comment
What if humans start working on something to prevent AGI from accessing your content and learning your style and copy it. Some sort of AGI blocker by putting an algorithm that can only be unlocked with your fingerprint or retinal scanning. Or maybe we can do it that AGI has to be run by a human and we make our jobs easier and gives us more time to travel and work on leisure and artistic ventures. Our hope for survival is to go back to nature and refuse to use technology and rely on nature and our instincts, going back to the basics. Grow our own food and live off the grid as a last resort for survival. If we refuse to use AGI then we can take away it's power.
youtube · AI Governance · 2025-11-15T16:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
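The "Coding Result" table above is a straightforward rendering of one coded record's four dimensions plus its timestamp. A minimal sketch of such a renderer (the helper name is hypothetical; the dimension labels follow the table above):

```python
def render_coding_table(record, coded_at):
    """Render one coded record as the markdown table shown above."""
    rows = [
        ("Responsibility", record["responsibility"]),
        ("Reasoning", record["reasoning"]),
        ("Policy", record["policy"]),
        ("Emotion", record["emotion"]),
        ("Coded at", coded_at),
    ]
    lines = ["| Dimension | Value |", "|---|---|"]
    lines += [f"| {k} | {v} |" for k, v in rows]
    return "\n".join(lines)
```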
Raw LLM Response
```json
[
{"id":"ytc_Ugwaps6cQNos7oUZkdB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyhKAC9bS2_Os01u7F4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyAyGL2cHypaqU0Fr54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz2Uw_epKYYpPKpAIJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx401PAgwO3ehiwIHZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugzpy7mw5IHA6p8YVzh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgxQ5b_VTJeAAB1KJi14AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwIw2KmnuLC4LCHcc14AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy2HsUiQlsiM2ezV7h4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwrR3rUstYk4EzDNip4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"}
]
```
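Before trusting a raw batch response like the one above, it is worth parsing it and checking each record against the codebook. A minimal sketch — the allowed values are inferred from the responses shown on this page, and the real codebook may define more categories:

```python
import json

# Allowed values per dimension, inferred from responses on this page;
# treat as an assumption, not the authoritative codebook.
ALLOWED = {
    "responsibility": {"user", "developer", "company", "ai_itself",
                       "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "regulate", "ban", "industry_self", "unclear"},
    "emotion": {"approval", "fear", "outrage", "resignation",
                "indifference", "mixed"},
}

def validate_batch(raw_text):
    """Parse a raw LLM batch response and flag out-of-codebook values.

    Returns (records, errors) where each error is a tuple of
    (comment id, dimension, offending value).
    """
    records = json.loads(raw_text)
    errors = []
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                errors.append((rec.get("id"), dim, rec.get(dim)))
    return records, errors
```

For the batch above, `validate_batch` would return an empty error list; a record with an unexpected value (or a missing dimension) shows up as one error tuple per violation.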