# Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking a comment up directly by its ID or by opening one of the random samples below.
Random samples — click to inspect:

- "The love of money is the root of all evil, according to the Bible. While the pur…" (ytc_Ugza1g0TK…)
- "I mean im a 3d none Ai artist, Ive never really felt threatened by Ai and never …" (ytc_UgxpR4LDd…)
- "I’m hoping other republicans besides me can acknowledge this Ai bill is a horrib…" (ytc_UgzfVGDxE…)
- "This debate is an example of people arguing different things. AI twats think abo…" (ytc_UgyTzb0f6…)
- "Just like cyberpunk fiction inspiring rather than warning, AI fiction will inspi…" (ytc_UgxOPTJJ_…)
- "Stop dramatising nonsense. All AI does is predict the most likely response to so…" (ytc_UgzkYH88P…)
- "However, what I notice is that AI tends to be a little too eager to please. Ever…" (ytc_Ugxo-tYJx…)
- "You know that you can generate more than one answer with LLM? And all these answ…" (ytr_Ugyhn-yGJ…)
## Comment
One thing that is GLARINGLY missing from this AI replacement argument ... sure, we can make humanoid robots, but at what cost? How can a restaurant with 20 tables afford to purchase these robots and then maintain them. They are mechanical. They break down. They become obsolete. If you think of it all in this context, the whole notion seems silly.
Another example: If I am a landscaping company and I employ a team of robots to mow lawns, blow leaves, weed garden beds ... perhaps (since most of these workers are absolute ding-a-lings) I can outperform the humans, but eventually the humans are going to sneak around and destroy these robots for taking their jobs. The owner would face astronomical costs to maintain these robots after they are run over, bashed-in with hammers, and shot-up with guns. People aren't just going to sit around and watch their livelihoods disappear.
Source: youtube · AI Governance · 2026-03-28T15:0…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
## Raw LLM Response
```json
[
  {"id":"ytc_UgxsaYuHydwA65gMcZJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzkwkWg4FOptRK1O894AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwPa7vTfspmUesTJVR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz1-OpVwyztYkJVAMd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgyqdPvH-XGIbfX90CR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyUAhkxyZ8k8HxJYi54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwk52TMyw_Q1G7_q1J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzJgxiOcUu1pqnlr4Z4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyEow2UU8AD6As_Jc54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzK9m7tB0lVIS2va8Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
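Because the raw response is a plain JSON array of per-comment codes, looking a comment up by its ID reduces to parsing the array and indexing on the `id` field. A minimal sketch, assuming the array shape shown above (the function name and variable names are illustrative, not part of the tool):

```python
import json

# A two-row excerpt of the raw LLM response above, in the same shape:
# one object per comment, keyed by "id", with the four coded dimensions.
raw_response = """
[
  {"id": "ytc_UgxsaYuHydwA65gMcZJ4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyUAhkxyZ8k8HxJYi54AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
"""

def lookup_codes(response_text: str, comment_id: str):
    """Return the coded dimensions for one comment ID, or None if absent."""
    rows = json.loads(response_text)
    by_id = {row["id"]: row for row in rows}  # index the array by comment ID
    return by_id.get(comment_id)

row = lookup_codes(raw_response, "ytc_UgyUAhkxyZ8k8HxJYi54AaABAg")
print(row["responsibility"], row["emotion"])  # company indifference
```

Building the `by_id` dictionary once makes repeated ID lookups O(1), which matters when inspecting many comments against a large batch response.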