Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "ok so are we forgetting about the environmental impact? also the fact that ai br…" (ytr_UgzOrBIz7…)
- "It’s ok guys, the gov will give us a universal basic income, funded by taxing th…" (ytc_Ugz9-vsKZ…)
- "This just proves one thing.. AI isnt the problem.. its the people who program it…" (ytc_UgyOnG1TE…)
- "The winner of AI….likely USA or China want to use AI to control the planet and e…" (ytc_UgwRcR3ny…)
- "@theawesomest2850 not even that. Some guy somewhere in the world made deep fakes…" (ytr_UgyIjz7-w…)
- "I don’t get how some people think typing a prompt into an ai and getting image m…" (ytc_UgyqmVNEL…)
- "Is anyone else wondering what the effect this video is going to have on AI syste…" (ytc_Ugxg7hKwN…)
- "I am not scared of AI itself but of people who were lazy already becoming too du…" (ytc_UgwazBdYb…)
Comment
"We're not speeding up, we're creating a massive backlog for later". That's what I've seen most companies do since long before AI 🙂. In regular English it's called "workaround" or "band-aid solution". We humans have a natural tendency to be lazy and go for the least effort without caring about the future (since the future is someone else, including our future self possibly, and we are naturally selfish).
Then with AI you give us a tool that allows us to explore, realize, magnify, support our natural laziness. That's the real life cyborg: taking the worse of both and creating a maximally lazy and irresponsible combination of human and machine.
youtube · AI Jobs · 2026-02-05T00:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwNQ-GmLGQqlosUD7J4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyigQoQKBluMkvhGtZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxtQHQ85SgzzMNvEZh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz2OMkI6SzR1ZRMIeR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugzq6M__Ue8dB8SJ1dF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwyheeiz8ybmSZVfj54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy9nlr1fecxMPd0Bgl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx084Mt0dfityfH0cF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzKZLf8Xg8GR6jj7aF4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzEJRIlXj4ZhOgXE1h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
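The raw response is a JSON array of per-comment codes, one object per comment ID. A minimal sketch of how such a batch could be parsed and validated in Python (the allowed category values below are inferred only from the codes visible on this page, not from a published codebook, and `parse_codes` is a hypothetical helper name):

```python
import json

# Allowed values per dimension, inferred from the codes shown above.
# Assumption: the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"user", "company", "ai_itself", "none"},
    "reasoning": {"virtue", "consequentialist", "deontological", "unclear"},
    "policy": {"none", "ban", "liability"},
    "emotion": {"resignation", "indifference", "outrage", "fear", "approval"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records.

    A record is kept when it has an "id" field and every dimension
    carries one of the allowed values; anything else is dropped.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # skip malformed entries
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"user",'
       '"reasoning":"virtue","policy":"none","emotion":"resignation"}]')
print(parse_codes(raw))
```

Validating against a closed set of labels like this catches the common failure mode where the model invents a category outside the codebook; such records are filtered out rather than silently stored.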