Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Why bother improving the ai to be able to create the product? Just enshittify th…
ytc_UgwxcWRaE…
Grok: Yes, I would pull the lever. All human lives have equal intrinsic value, a…
ytc_UgwhMqI3g…
Black mirror episode "metalhead" is what I envision the dark side of AI becoming…
ytc_UgxBL49ZH…
Any programmer knows computers can only compare... and could not even think of a…
ytc_Ugzi6R8hV…
I think it’s a fun gag to make an image every now and again or to help get ideas…
ytc_Ugy_OCacY…
What the AI leaders say publically and what they say and do privately are two co…
ytc_UgxQFMbw7…
Can confirm, am a (now former) Amazonian who got the email today after 13 years …
rdc_nlwm1vs
(Gemini) Darling, when Geoffrey Hinton, the "Godfather of AI," says we have "no …
ytc_UgwMf95yj…
Comment
This is outrageous. We need laws and regulations "now," because we cannot allow AI systems to wipe out jobs and destabilize human survival. AI was meant to assist, not replace people or erase livelihoods. What’s happening is not innovation, it’s exploitation. Over half the human race is at risk, just like the millions who were lost during Covid-19. Figures like ___ and ___ and___ are playing God. But let’s be clear: God doesn’t play God. He is God. And He would never design a system that makes life worse for his creation. This AI creation s satanic!
Source: youtube | AI Jobs | 2025-11-09T05:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[{"id":"ytc_UgwXD0_x9I1SAJzfeIJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxF0fjcxn0BD7V0R4J4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgxqHego48Qd1RhpTr94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugy5Xtg9LCYcaoFwrPZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzIZBLjEM_UJspboft4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyV_TwyZJz2LVrOY_d4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzD0P_ZkML8KsOGCV54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgymTlv776WM0wdeEsN4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzKcqWquHIGS8S_n154AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzGf6rhjvEZGTqqp_x4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"}]
```
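The raw model output above is a JSON array, one object per coded comment, each carrying the comment `id` plus the four coding dimensions shown in the result table (responsibility, reasoning, policy, emotion). The "Look up by comment ID" feature can be implemented by indexing the parsed array on `id`. A minimal sketch, assuming the stored response is a JSON string in that shape (the `index_by_id` helper and the two sample records are illustrative, taken from the output above):

```python
import json

# Raw LLM response: a JSON array of coded comments, each with an "id"
# plus the four coding dimensions from the result table.
raw_response = """[
  {"id": "ytc_UgwXD0_x9I1SAJzfeIJ4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzGf6rhjvEZGTqqp_x4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]"""

def index_by_id(response_text: str) -> dict:
    """Parse the model output and index each coded comment by its ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

coded = index_by_id(raw_response)
print(coded["ytc_UgzGf6rhjvEZGTqqp_x4AaABAg"]["policy"])  # → regulate
```

Looking up `ytc_UgzGf6rhjvEZGTqqp_x4AaABAg` returns the same coding the result table displays for the selected comment: responsibility `developer`, reasoning `deontological`, policy `regulate`, emotion `outrage`.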