Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "@markupton1417 I’m actually a big fan of AI, I use it everyday for work, and hav…" (ytr_UgzlMW-Zh…)
- "Girlll what imperfections are you talking about !? Lol ai is the one that can't …" (ytc_UgxfV5TIN…)
- "Yes. It is quite late, especially since China is really ahead in this field…" (ytr_Ugykc0Lth…)
- "I’ve been wondering if changing the alternative text (text that isn’t seen and u…" (ytc_UgyRc5JNF…)
- "I know this is a short video but you failed to explain why this is an issue. Wha…" (ytc_Ugwk0FfPE…)
- "So much garbage… Ai has no desires, no goals , no internal dialogue. They operat…" (ytc_Ugw2nXgIR…)
- "Any race that makes up the majority would become the default value. The author i…" (rdc_luzegx1)
- "How is this allowed? How in the world does taking an entire industry that employ…" (ytc_UgzjDkNd5…)
Comment
Steven, you and your AI expert got this one all Wrong!!!
In a few years, AI will get “unplugged” and thrown away because it is not useful to 97% of the people on this planet.
It’s a toy that captivates the attention of a few high IQ people like you but useless to the real world.
And it is nowhere even near actual “intelligence”.
The greatest AI in the world right now isn’t even as “smart” as a human that is four years old.
You should put out content that is useful to the 97% of us real people.
You have an extraordinary platform. Don’t waste it.
With great power comes great responsibility. Think before you speak.
youtube · AI Governance · 2025-10-19T17:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwHMluy0kJn5blLn594AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx0AYObMigx_P66EPd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzonYyRbbkQKlcSgEN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzKEebplAYMBz0uMvt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyoPoJvRDedTM7F0CN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgybWovYb32e7JxmV5t4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxTnsusfTkdWI48NhN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzUCu4ShSb1Jcph7Nl4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy_douuXaThiwuE12t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy4h4BJR_QGRr0AuiN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
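The raw response above is a JSON array with one object per comment, keyed by comment ID across the four coded dimensions. A minimal sketch of how such a payload could be parsed and indexed for the look-up-by-ID workflow (the variable names are illustrative; the two entries are copied from the array above):

```python
import json

# Raw LLM response: a JSON array of per-comment codes, in the shape shown above.
# Only two of the ten entries are reproduced here for brevity.
raw = """[
{"id":"ytc_UgwHMluy0kJn5blLn594AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzKEebplAYMBz0uMvt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]"""

# Index the coded records by comment ID so any coded comment can be
# retrieved directly, as the "inspect by comment ID" view does.
codes = {entry["id"]: entry for entry in json.loads(raw)}

record = codes["ytc_UgzKEebplAYMBz0uMvt4AaABAg"]
print(record["responsibility"], record["emotion"])  # developer indifference
```

Indexing into a dict like this makes each subsequent ID lookup O(1), which matters when the same batch response backs many interactive inspections.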