Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I don’t mind self driving cars but imma stick to my big block v8 s and JDMs…" (ytc_UgxoP_L0D…)
- "AI isn’t killing bachelor’s degrees, it’s liberalism, lack of critical thinking,…" (ytc_UgzLdUgqJ…)
- "You have to be a complete IDIOT to believe a robot is dangerous when yOU the Hum…" (ytc_UgwSLK_Wy…)
- "In ten years, I want every boomer crying about AI to apologize to use for spread…" (ytc_UgzUSN3Fr…)
- "The fundamental problem with this approach is that generalities can't be applied…" (ytc_UgxsJ1DwR…)
- "a brain is still orders of magnitude better when it comes to energy requirements…" (ytc_UgxAwnJj-…)
- "Go break AI, I want AI that does my work and gives me free time to do art.…" (ytc_UgzjEEwNl…)
- "The people trying to use LLMs for these purposes baffle me, as an LLM does not r…" (rdc_kp0jiap)
Comment
AI has created so much worthless garbage and slop that it's just going to loop itself into training on the slop. Determining an AI product is so obvious to anyone who takes more than second to look at it. It's just so awful and annoying that for every single really well-made human piece of say a game or anime character, there exists some complete loser who dumps an absurd amount of garbage of that character that all looks the same. On Twitter, Pixiv, Rule34 (haha gooner), DeviantArt, ArtStation, etc. you see it so often. Thankfully a lot of sites have implemented or are implementing filters. The one place I really wish I could filter it out is Steam. For whatever reason they allow AI content and require devs to disclose that, but we can't filter it out? Why? SteamDB has filters for it so it's definitely something that could be done. It's just all so tiring and I'm glad I haven't had to deal with any AI troglodytes in person.
Source: youtube · Video: "Viral AI Reaction" · Posted: 2025-06-01T21:2… · Likes: 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id": "ytc_UgwT1Gpy5EV4fZ9HO3p4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxU4PQe-0VTjkDVG2d4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugzmuq96UcpZOH1tjVB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwPH6tiL6Jm1918Xb14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwYt4_g9lH9wODT3i14AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxQbCS7P256tMfj_kd4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugz13WAow8zBq2lokNd4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzmKwbWnvLw2RoZY3x4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "disapproval"},
  {"id": "ytc_UgzLUFWS9807lMTZ2Vp4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzT7zy8qzHiskyb7EV4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"}
]
```
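Looking up a coded comment by its ID amounts to parsing the raw LLM response above and indexing the records. The sketch below is a minimal illustration, not the tool's actual implementation: the `index_by_id` helper and the `EXPECTED_KEYS` validation are assumptions about how one might consume this format; the two sample records and their IDs are taken verbatim from the response shown above.

```python
import json

# Two records copied from the raw LLM response shown above (abridged;
# the real response contains one record per sampled comment).
raw_response = """
[
  {"id": "ytc_UgwT1Gpy5EV4fZ9HO3p4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzT7zy8qzHiskyb7EV4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"}
]
"""

# The four coding dimensions plus the comment ID, matching the
# "Coding Result" table above.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(response_text: str) -> dict:
    """Parse a raw LLM response and index the coded records by comment ID.

    Raises ValueError if a record is missing any expected dimension, so
    malformed model output is caught before it enters the dataset.
    """
    records = json.loads(response_text)
    index = {}
    for record in records:
        missing = EXPECTED_KEYS - record.keys()
        if missing:
            raise ValueError(f"record {record.get('id')!r} missing {missing}")
        index[record["id"]] = record
    return index

codes = index_by_id(raw_response)
print(codes["ytc_UgzT7zy8qzHiskyb7EV4AaABAg"]["emotion"])  # → outrage
```

Indexing once and looking up by ID keeps each inspection O(1), which matters when cross-referencing thousands of coded comments against their raw model outputs.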