Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "All manufacturing, all ground transport, all retail, all base to mid level tech,…" (ytc_UgxmfFcl1…)
- "Are you a teacher? I think you've missed the point. Kids don't have to learn wit…" (ytc_UgypXyqs_…)
- "So would you build ur own llm, and then use Linux with limited Google apps or pu…" (ytc_UgzfFU4Ii…)
- "This kind of simulation completely ignores what AI is doing. Every simulation of…" (ytc_Ugw0EDbRM…)
- "Yeah with big beautiful bill no AI regulations for ten years we going big brothe…" (ytc_UgyKcXvGN…)
- "I would simply not allow AI and, if need be, computers i the classroom. I would …" (ytc_UgymsAWGD…)
- "Why are people so scared of ai like jesus are you people scared of progression?…" (ytc_UgwxKG_BD…)
- "If the rich can't get richer because the lower classes do not make money because…" (ytc_Ugz2zaGom…)
Comment
With AI, the mirror is turning onto humanity, and it is becoming harder and harder to deceive ourselves into thinking that we are the be all and end all of intelligence.
Pure zero-work consumption, now being within reach, has begun to show how animalistic our species is and how we merely want to feel good forever.
The algorithms that control social media are designed for retention through our two most potent feelings: hate and liking. You react the most to things you hate, and to balance the hate it feeds you things you like. And most people are okay with this.
Why do I bring this up? Because both AI and algorithm-driven social media have something in common.
Zero-work consumption.
The feed gives you what you want, and so does the AI.
It feeds our animalistic urge to feel good.
To _feel_ good.
Source: YouTube · "Viral AI Reaction" · 2023-11-07T16:3… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgxyyCgrYZK-pVt7Nyl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxXsngLaaVxcQf4U6d4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxCtmdWmU4E-_0Bflt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxwzKMZzqLdcKze0XN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugziv9YERcRRV-R1LcJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugwh6cZTo7MwKEQ5QYl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx0DYoNEmZ5W8mDBfx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzVrIrtOdVKUaV_AYN4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy70lQUf5HEyBjNs5l4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxFjTRKuKMkO586SNB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
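The raw response above is a JSON array with one object per comment, each carrying the four coding dimensions shown in the table (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of consuming such a response — parsing it and tallying the values per dimension — might look like this (the comment IDs below are illustrative placeholders, not real `ytc_` IDs):

```python
import json
from collections import Counter

# Illustrative raw LLM response in the same shape as the real output:
# a JSON array of per-comment codes with four dimensions each.
raw = '''[
 {"id": "ytc_example1", "responsibility": "none", "reasoning": "mixed",
  "policy": "none", "emotion": "outrage"},
 {"id": "ytc_example2", "responsibility": "user", "reasoning": "virtue",
  "policy": "none", "emotion": "mixed"}
]'''

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def tally(raw_json: str) -> dict:
    """Parse a raw response and count how often each value occurs per dimension."""
    rows = json.loads(raw_json)
    return {dim: Counter(row[dim] for row in rows) for dim in DIMENSIONS}

counts = tally(raw)
print(counts["responsibility"])  # → Counter({'none': 1, 'user': 1})
```

Aggregating this way across many raw responses is one route to summary distributions over the coded dimensions; any validation of allowed values (e.g. that `emotion` is one of the expected labels) would sit in the same parsing step.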