Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytc_UgxOLjJAr… : "Uuuuhhhh what is the dunk here? More images to feed for the AI ... and while tho…"
- ytc_UgwNNajhd… : "If the ones programing AI are angry, and they believe that the greatest problem …"
- rdc_cthqlpf : "Well current aerial drones don't make any decisions, all they do is a bunch of c…"
- ytc_UgyOdIddC… : "i love my AI. it knows me better than i know myself sometimes. it’s quite amazin…"
- ytc_UgzSLJe6e… : "Every tech executive sees dollar signs when they think of AI. They will force it…"
- ytc_UgzLZ6Ree… : "i don't think that all the multi billionaire will be richer with this so called …"
- ytc_UgxpN-g4t… : "Generative AI was made by techbros who said, “Art is hard, how can I make it so …"
- ytc_UgzcQMum8… : "OMG! This is a prime example of we were all worrying about. I occasionally poste…"
Comment
About what you said on gen AI being held up by investors. To me it seems so stupid of a strategy to ram it down our throats like they're doing now. I literally stopped using AI (and AI adjacent) tools that I used to use because I hate how much AI bullshit is being pushed onto me. Google assistant is an example. I threw out my Google home and disabled assistant on my phone despite not being purely AI stuff like Gemini is. And then WhatsApp added a Meta AI? Actively transitioning to signal as we speak. Anyway, slightly off topic but I depending on how much AI is in our future, I might actually proudly become one of those luddites in the colloquial sense.
Source: youtube
Video: Viral AI Reaction
Posted: 2025-03-31T08:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxyYypJu9hTqBDzNZZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwVJypFDi3Qh7_vW1p4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy2tI9cH67Y_2JKxbd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyE_-9VnnF0tN6KCQ14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwTcyqwTZNachbJvuZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwHbHlX0TRnEgccpVN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxhUrYw9HbQZbHRuhl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxvHiEgVQyHePoKPpp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwux3HQwdExToSKM-V4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzuASXC97la3LdjLBx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
```
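The raw response is a JSON array of coding objects keyed by comment ID, which is what makes the lookup-by-ID view above possible. A minimal sketch of parsing it and indexing by `id` (the variable names are illustrative, not part of the tool; only two rows from the array are reproduced here):

```python
import json

# Two rows copied from the raw LLM response above.
raw_response = '''[
{"id":"ytc_UgxyYypJu9hTqBDzNZZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzuASXC97la3LdjLBx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]'''

# Index every coding object by its comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up a single comment's coded dimensions.
row = codings["ytc_UgzuASXC97la3LdjLBx4AaABAg"]
print(row["policy"])  # regulate
```

Indexing into a dict mirrors the "Look up by comment ID" control: once parsed, any coded dimension (responsibility, reasoning, policy, emotion) for a given comment is a single key access.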