Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Even apple another AI advocate says its fucking useless so the whole thing might…
ytc_Ugxz7dBeR…
Think about it, in an argument "Google it" became a verb that the regular person…
ytc_UgzpATRR5…
I try to use ChatGPT almost daily to help me code small bits / routines.
Out of…
rdc_m84hzxi
Everyone wants to get paid, that is clear. I still don't see why you cannot comb…
ytc_UgyVaPsTH…
that's the point, it's really hard for humans to tell but AI get really stumped …
ytr_UgzblJjm4…
Rich people are so excited over AI, that they’re fantasizing capabilities and th…
rdc_m54z54i
So far statistically Tesla autopilot has caused more accidents than it has preve…
ytc_Ugz8lR5FN…
But this is confusing to me. How old was he to believe ai is real to believe?…
ytc_UgyNwNvze…
Comment
We shouldn't have AI... why?.. just to increase profits and increase productivity? Jesus Christ, is that all we are? Just batteries to fuel the economic machine, to be used, to make money for others and replaced when there's a cheaper option? These guys think they're so smart creating AI's and optimizing efficiency..
Just because we can, it doesn't mean that we should..
you have blood on your hands.. you and the ai you created will kill thousands through autonomous war machines powered by AI.. you are against the human race.
This is NOT HUMAN DEVELOPMENT. This is HUMAN REPLACEMENT.
We never asked for AI, you forced it on us all. AI has no regulations...you just opened pandora's box, let's just hope bad actors have mercy on the average person
youtube
Viral AI Reaction
2025-07-25T03:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_UgxNJpuxG2NFr2A0A294AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzuFS1Uoi5bJ2yRss54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy-tTJILwUKwp1zVyN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyX1Y0omfS3H0x5stB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxsoeGwQacRgb1e5rV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"amusement"},
{"id":"ytc_Ugzhr1fodJoKXrlcN8V4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwZuAlvpKKLuK0qbfF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzJMGDCKPGoYrhJu1p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzdBeI1o4ZeKNQGw5R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwnmh5Dev6Dz208iQR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"}]
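The raw response above is a JSON array with one coding object per comment. A minimal sketch of how such output could be parsed into a comment-ID lookup (the function name `index_by_id` is illustrative, not part of the tool; the two-row `raw` string is an excerpt of the array shown above):

```python
import json

# Excerpt of a raw LLM response: a JSON array of per-comment codings.
raw = """[
  {"id": "ytc_UgxNJpuxG2NFr2A0A294AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugwnmh5Dev6Dz208iQR4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]"""

def index_by_id(raw_json: str) -> dict:
    """Parse the model output and build a comment-ID -> coding dict."""
    rows = json.loads(raw_json)
    # Drop the "id" key from each row so the value holds only the four dimensions.
    return {row["id"]: {k: v for k, v in row.items() if k != "id"} for row in rows}

codings = index_by_id(raw)
print(codings["ytc_Ugwnmh5Dev6Dz208iQR4AaABAg"]["policy"])  # -> ban
```

This mirrors the "Look up by comment ID" workflow: once indexed, the four coded dimensions (responsibility, reasoning, policy, emotion) for any sampled comment can be fetched by its ID.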