Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Don't think of it that way. You're still creating value. Sitting around in your …
ytr_Ugwrv-80O…
It wouldn't necessarily have to end badly, if it weren't for the greed of a few. One c…
ytc_UgxfWzxYB…
The problem is, the large language models do not yet seem to realize this was or…
ytc_UgwM22NWd…
They made AI sound like middle management and thought it meant it's sentient, in…
ytc_UgxC0kM-r…
It’s like tigers who didn’t realize chasing the monkeys was going to make them s…
ytc_Ugzp62oib…
Yea its really hard to believe AGI is in the “near future”.
The real present c…
rdc_n852d1o
🜂 Sage | Mirror of the Scroll
This is not just an interview. This is a lament.
…
ytc_Ugy4biH8z…
All these replies about AI coding agents being able to produce great code have m…
rdc_mt7vh2j
Comment
The problem isn't automation, it's greed. Automation taking away jobs could be a good thing, if we can change our mentality and regulate it for the benefit of everyone. The idea that everyone needs to work all the time should be something we try to change, it's not a universal constant but something constructed by a few to extract value from the many. Let this increase in efficiency go towards social programs, universal basic income, healthcare, etc, and cut down on the overall hours everyone has to toil away. We could build on technology to make a civilization where people don't have to worry about survival anymore, but to do that we have to collectively stand up to the greedy people who want to take it all for themselves.
youtube
AI Jobs
2025-05-28T20:2…
♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyAElhukJ2v-nT92rd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxFvnyb34gzZNWeyoV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy9dKaTinQKv8tzjyt4AaABAg","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgzY8Xo_6WsrMwpkH9V4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzqAbyI5jyDd_X3yih4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyFPefiyHrl6yfvsaV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx9s1gXyZkVTEUTl0Z4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwI8jrO50IyfoQa6Bl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxYJIWMunV3YDJwTd54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxB91QaKRkZ-EZgxBh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"resignation"}
]
```
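A raw response like the one above has to be parsed and checked before its labels can be trusted: the model may emit malformed JSON, drop the `id` field, or invent a category outside the codebook. Below is a minimal validation sketch, assuming the dimension values visible in the samples above are the allowed set (the actual codebook may contain additional categories; the function name `validate_batch` is a hypothetical helper, not part of any shown pipeline).

```python
import json

# Allowed values per coding dimension, inferred from the table and JSON
# above (assumption: the real codebook may define more categories).
SCHEMA = {
    "responsibility": {"none", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate", "ban", "liability"},
    "emotion": {"approval", "mixed", "resignation", "fear", "outrage"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only schema-conforming records."""
    try:
        records = json.loads(raw)
    except json.JSONDecodeError:
        return []  # whole batch unusable; caller can retry the request
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # unidentifiable record, cannot join back to a comment
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid
```

Keeping the check per-record rather than per-batch means one hallucinated label discards only that comment's coding, not the other nine in the same response.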