Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I've been doomscrolling about this for a while. It's kinda refreshing to see someone on here not fluffing it up with either extra doom or false hope about how AI isn't good enough yet. It feels really gaslighting to see everyone pretending like they know the future, or posting "We're cooked" only to make a video about how it's no good enough yet. The algorithm is really feeding the spiral, which is ironic if you think about it also being AI. AI is threatening us, then when we freak out about it, we generate our own whirlpool of nonsense, because there's a market for doom and hope. It truly is a time.
Source: youtube
Video: Viral AI Reaction
Posted: 2025-07-21T16:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwV9m27cLurI8Iul1V4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzLYGDgmJU5-Xqr3I14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxcN0hlLnLTVjSOMMF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwL0x_aXYfC_SGdx914AaABAg","responsibility":"society","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyYg1w4IhI-J3w17hV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxycQrpRvL85yKwfFt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyzGarQtY9Hny8Pu9B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxOAdGukuOJkiJl0mR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz2HbnWelwYDkG3BcJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyLw62iJDkISM9trxJ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
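The lookup-by-ID behavior described above can be sketched in a few lines of Python: parse the raw model response as a JSON array and index the rows by their `id` field. This is a minimal sketch, not the page's actual implementation; the two rows in `raw` are an excerpt of the batch shown above, and the dimension names (`responsibility`, `emotion`, etc.) are taken from the coding table.

```python
import json

# Raw LLM response: a JSON array of per-comment codings
# (excerpt of the batch shown above).
raw = """
[
  {"id": "ytc_UgwV9m27cLurI8Iul1V4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzLYGDgmJU5-Xqr3I14AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"}
]
"""

# Index the batch by comment ID so any coded comment can be looked up directly.
codings = {row["id"]: row for row in json.loads(raw)}

row = codings["ytc_UgwV9m27cLurI8Iul1V4AaABAg"]
print(row["responsibility"], row["emotion"])  # -> none indifference
```

A real pipeline would also want to handle model output that is not valid JSON (e.g. wrap `json.loads` in a `try/except` and flag the batch for re-coding), since raw LLM responses are not guaranteed to parse.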