Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- “It makes it less real and/or not real at all. It wasn’t a good analogy, you were…” (ytr_Ugy1flQgw…)
- “I thought it looked wonky ASF lol but I love the real art! We should stop calli…” (ytc_UgxUiRuUj…)
- “Looking forward to even more polished apologies from this latest AI version - es…” (ytc_Ugz9WbwUC…)
- “AI art is like an electric car with engine sound from the speakers, it may sound…” (ytc_UgzCxgumi…)
- “It's very clever how you fooled the AI. I tried to troll it and it threw the boo…” (ytc_UgwL6kNkE…)
- “To all the people saying it doesn’t look good: you’re probably seasoned animator…” (ytc_UgwLiZtji…)
- “It's neither. LLM are statistical generation machines. Hallucinations occur beca…” (ytr_Ugzc3FoPl…)
- “I do use AI for reference, but only for things like the outline, etc. The detail…” (ytc_UgxSJZFJt…)
Comment
AI has been mostly useless for me, so far, no matter if it is generative or any of the LLMs available. As soon as the work you're doing with it gets slightly more sophisticated than a Google search things turn sour, so quick.
The code it produces is barely usable. They make so many mistakes because half the time it will back pedal on its decisions because it won't even understand its own code. When I point that out it scrambles to follow my order, ultimately repeating the mistakes. Especially Microsoft's Copilot tends to never say no to any of my prompts only to fail miserably at what I want it to do.
| Source | Video | Posted |
|---|---|---|
| youtube | Viral AI Reaction | 2025-04-01T05:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgwO4eN0_4F0IJgbbn14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzapHuSYNcAM47SkAx4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyP9HhRWYGqe6Iht3x4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwzbJhNiI_j5wJtRrJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwjnl_Oj4M1zbaFou54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwHLWWYd16V7iBzhkN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyACHgtkWcjXo0ghiN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugy7fSSj3hRd9S87FWl4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx4oID2XQHkEjKJ7UV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz-fOjLTx3yuyZVIHd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"frustration"}
]
```
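A raw response like the one above can be parsed and sanity-checked before it is loaded into the coding table. The sketch below is a minimal validator, assuming the allowed labels are exactly those observed in this page's output (the full codebook may define more; the function and label sets are illustrative, not part of the tool):

```python
import json

# Label sets inferred from the codes visible in this output.
# Assumption: the real codebook may contain additional labels.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "ban", "regulate"},
    "emotion": {"indifference", "outrage", "approval", "frustration", "mixed", "unclear"},
}

def validate_codes(raw: str) -> dict:
    """Parse a raw LLM response and return {comment_id: codes},
    raising ValueError on any label outside the assumed sets."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={rec[dim]!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded
```

The returned mapping also supports the lookup-by-comment-ID view: `validate_codes(raw)["ytc_UgwHLWWYd16V7iBzhkN4AaABAg"]` yields that comment's four dimension codes.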