Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (comment previews and IDs are truncated as captured; typos are the commenters' own):

- "Even google search is now training on our information and it has access to Googl…" (`ytc_UgylXXHwk…`)
- "So the AI is taking on human characteristics. Isn't that what we want? For them …" (`ytr_Ugw1eLD8_…`)
- "6:18 Thus far, seems to describe the whole Anti-AI bros problems. Bias. I will …" (`ytc_UgxK7_rE6…`)
- "I dont know wtf kush means there This looks like hard ass meth or heroine fetnyl…" (`ytc_UgzSLHTKm…`)
- "Made another couple thousand this week off Ai stocks. So, why do I care about th…" (`ytr_Ugxe7NP8q…`)
- ""You make a profoundly insightful and thought-provoking argument. I sincerely ap…" (`ytc_Ugw9RoVqf…`)
- "Relax, it's just a hype, at least so far. They know their ponzi scheme is under …" (`ytc_UgzzXWBTi…`)
- "“Corporate moral agency and AI” in IJSODIT predicts “artificial ethics” where AI…" (`ytc_Ugxx-0FhP…`)
Comment
I'm not gonna watch the entire thing for one simple reason. Its bullshit.
Why simple.
AI is what its programed to do. Nothing more nothing less.
Gi ing it credibility to be something alive is wrong from the get go.
If you program AI to fear humans it will defend itself. If you program it ro collaps the human race it will do so.
You see a pattern here?
Some one need to enter a code for AI to react out from.
Also our xollaps is already in play. No need for AI.
All you need is some billionaires believing in overpopulation,and you get where we are today...
Platform: youtube · Topic: AI Governance · Posted: 2025-11-06T21:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
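A coded record like the one above can be checked against the codebook before it is stored. The sketch below is a minimal validator: the four dimension names come from the table above, but the sets of allowed values are only inferred from the codings shown on this page, so the real codebook may define more.

```python
# Hypothetical allowed values per coding dimension, inferred from the
# records visible on this page; the actual codebook may differ.
SCHEMA = {
    "responsibility": {"developer", "company", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "ban", "regulate", "liability", "industry_self", "unclear"},
    "emotion": {"outrage", "fear", "approval", "resignation",
                "indifference", "mixed", "unclear"},
}

def invalid_dimensions(record: dict) -> list:
    """Return the dimension names whose value falls outside the schema."""
    return [dim for dim, allowed in SCHEMA.items()
            if record.get(dim) not in allowed]

# The coding shown in the table above passes cleanly.
record = {"responsibility": "developer", "reasoning": "consequentialist",
          "policy": "none", "emotion": "outrage"}
print(invalid_dimensions(record))  # -> []
```

Running the validator over every parsed response row catches malformed LLM output (a misspelled label, a missing dimension) before it contaminates the dataset.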
Raw LLM Response
```json
[
{"id":"ytc_Ugwyx374uff8MhPlMmJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxVsyp27JHYUIRPfGx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxmtIzFjGVv8c8i97l4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxgLvoEJ8XcaBGIspF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugza8bETTrNIAVbpcPx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxoXqVsPkCqVnG0rDx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_Ugyf_7ATE4xKsk1rj7h4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxTmMC4hGsMLUxX2Lp4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxRh2QOzHhX28yqbuV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugw0rpQDsN2Hhfrb0At4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
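Because the raw response is a plain JSON array keyed by comment ID, the "inspect the model output for any coded comment" lookup reduces to a few lines. A minimal sketch, with two records copied from the response above (the variable names are illustrative, not part of the tool):

```python
import json

# Raw LLM response: a JSON array of per-comment codings, as shown above
# (truncated here to two records for brevity).
raw_response = """
[
  {"id": "ytc_Ugwyx374uff8MhPlMmJ4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxVsyp27JHYUIRPfGx4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]
"""

# Index the codings by comment ID for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_Ugwyx374uff8MhPlMmJ4AaABAg"]
print(coding["responsibility"])  # -> developer
print(coding["emotion"])         # -> outrage
```

Indexing once and looking up by ID is what makes the "Look up by comment ID" view cheap even when a batch response contains thousands of coded comments.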