Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Forget self driving cars - give us self driving buses and autonomous subways / l…
ytc_Ugx65-Vq0…
@StupidIsTheNorm if ai tells me gold is worthless I’m still not giving it away f…
ytr_UgzHqHhSh…
My employer is asking for these anecdotes too, and so far it's been crickets. I …
ytr_UgwVNneL9…
Make perfect sense when I was scrolling on shorts guess what the person said to …
ytc_Ugzb54CIh…
What are the chances this website results in legislation to restrict the use of …
rdc_mzmq7lt
So basically… everyone’s cheating now? Students, professors, and probably the gu…
ytc_Ugw9auryI…
Science fiction was talking about the rise of the machines 50 years ago. The onl…
ytc_UgxbEYGBd…
I think AI is a terrible idea, but why can't data centers use a refrigerant in a…
ytc_UgzardHN_…
Comment
Generative AI has a huge fault that I keep saying to my friends and personal circles:
Generative AI is fundamentally incapable of seeing anything in the future or past. It simply says "This makes sense to be this!" and doesn't really think about the area around it. Hybrid creatures made with no real throughline for their biological makeup, words literally being mush in pictures, starting and stopping plot points, generative AI cannot create anything because it cannot exist outside of the present moment by default. Everything else is a "maybe" indefinite decision until time advances to the point they have THAT as the present.
youtube
Viral AI Reaction
2025-10-30T14:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id": "ytc_UgwpBwyri9uTb-V3x0h4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugzc_oz7pBEU0qQUBb14AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxBgxKte-BEasQKDPN4AaABAg", "responsibility": "unclear", "reasoning": "deontological", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_Ugwo2gCYHsWnBYBOngN4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugw8TOKtwJOvNFFSlvd4AaABAg", "responsibility": "unclear", "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgziQWE-YJn7jd6vdHF4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugy5zxeS9vvhelB2FWl4AaABAg", "responsibility": "government", "reasoning": "unclear", "policy": "regulate", "emotion": "unclear"},
  {"id": "ytc_Ugz5pTjCNvzmssL46v14AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgzsJzT-bRlPrazzS4F4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "resignation"},
  {"id": "ytc_Ugw63LVj1CO8-4TGaJd4AaABAg", "responsibility": "unclear", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"}
]
```
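A raw response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal example, assuming the four dimension names shown in the response; the allowed value sets are inferred from values seen in this page and may not match the full codebook.

```python
import json

# Allowed values per coding dimension -- an assumption inferred from the
# values observed in raw responses on this page; the real codebook may differ.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user", "developer", "government", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "ban", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "resignation", "mixed", "unclear"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response (a JSON array of row objects)
    and keep only rows whose dimension values are all recognized."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in values for dim, values in ALLOWED.items())
    ]

# Hypothetical usage with a shortened comment ID:
raw = '[{"id": "ytc_Ugz...", "responsibility": "ai_itself", ' \
      '"reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}]'
print(parse_coding_response(raw))
```

Rows with an unrecognized value in any dimension are dropped rather than repaired, which keeps the stored codes clean at the cost of a second pass for malformed model output.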