Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "AI doesn’t need to be conscious to be dangerous to us humans. It just needs reso…" (ytc_UgyhV3tbt…)
- "I don't think you can train the AI with just pictures either. You need tags and …" (ytr_UgzsW8h5K…)
- "I'd consider waving my arms when riding behind a tesla, maybe the AI will read m…" (ytc_UgxWnEcFP…)
- "One thing people overlook is that people are lacking. If we could have five time…" (ytc_Ugw2-8v2T…)
- "Elegant Elliott the “guy robot” literally was talking about how he wants to take…" (ytr_UgwugckiX…)
- "AI isn't "developing" it's skills faster than an artist does. It would take 100s…" (ytc_UgyxSeYXM…)
- "Real soon men. We will have one of those ai lady’s who will love you honor you n…" (ytc_UgxaZ20AZ…)
- "I might be a strange one here, but I never think AI Art itself is the problem, b…" (ytc_UgzP-k_r6…)
Comment
"open to slow it down?" ....NEVER! ....Because America is in a race with China and China would never slow it down for sure. America companies are already spending Billions to build AI. One company is spending now $500 Billion dollars to build a Mega AI facility.
Platform: youtube
Video: Viral AI Reaction
Posted: 2025-08-15T00:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxiRsUmJqPB4cPCelJ4AaABAg","responsibility":"government","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzDSEqPeEXJurJJszl4AaABAg","responsibility":"user","reasoning":"mixed","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgzkNTm4E6233ECy3PB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwrEi4OjL7bstZKgmR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyJggz_E6fn5s1JPwJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
  {"id":"ytc_UgzoFgMGBPMLZNCVX_J4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgyQseM9EW-WifBabyt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxFv1i3oO3BfFfXB5l4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxhbmGV2FaYt0V8x-l4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxcHUNAUq3HihnOrQd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
```
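The raw response is a JSON array of per-comment codes, each carrying the four dimensions shown in the result table (Responsibility, Reasoning, Policy, Emotion). A minimal sketch of how such a response might be parsed and validated before it is stored: the allowed value sets below are inferred from the samples on this page (the real codebook may include other categories), and `parse_codes` is a hypothetical helper name.

```python
import json

# Controlled vocabulary per coding dimension.
# NOTE: these sets are inferred from sample output, not the official codebook.
ALLOWED = {
    "responsibility": {"government", "company", "user", "distributed",
                       "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "industry_self", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation", "mixed", "unclear"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response into coded records, rejecting malformed ones."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs on this page use ytc_ (top-level) / ytr_ (reply) prefixes.
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec.get(dim)!r}")
    return records

# One record from the sample response above.
raw = ('[{"id":"ytc_UgyJggz_E6fn5s1JPwJ4AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"industry_self","emotion":"fear"}]')
codes = parse_codes(raw)
print(codes[0]["emotion"])  # fear
```

Validating against a closed vocabulary catches the common failure mode where the model invents a category label that the downstream coding table cannot store.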