# Raw LLM Responses

Inspect the exact model output for any coded comment.

## Look up by comment ID
## Random samples — click to inspect

- "People have been thinking embodiment might be the key to AGI since computers wer…" (rdc_mdj1u91)
- "You said as some point that human are limited because of biological contraint an…" (ytc_UgxUPQQ-r…)
- "When comparing AI with humans and by that setting the gold standard of "Intellig…" (ytc_Ugz0HuYIr…)
- "It's okay, college killed off the value of a college degree long before AI ever …" (ytc_UgxNV7Nqp…)
- "Whoa Whoa Whoa. We are only allowed to hate on America in this sub. Not so fast …" (rdc_da461xg)
- "This is why socialism is necessary under extreme automation. But we don't want t…" (ytc_UgzM-rYVI…)
- "AI will copy at the beging. And then suddenly it will create its own masterpice.…" (ytc_UgwAOBvL8…)
- "Ok I think I know where this is going. Now racial disparities are not the same a…" (ytr_UgyBP9SfK…)
## Comment

> Another part of the short-term to mid-term predictions is that decreasing the cost of software changes the cost benefit analysis of developing software. A team might be 10x more productive but that doesnt mean 90% get fired. Most workplaces have tons of super repetitive knowledge work that should be automated but its not worth the cost of building out a custom product for some random niche use case. This all might get unlocked if there are extreme productivity boosts. As someone who deals with budgeting decisions... If a small team is highly productive they will usually get more money for staff hire and be assigned more work. I assume this would play out at the macroeconomy as well. People who are really in trouble is low skill knowledge workers doing repetitive tasks.

Source: youtube · Posted: 2025-03-12T21:5… · ♥ 1
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
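Each coded dimension takes a value from a small closed set. A minimal validation sketch, assuming only the values that appear in this page's output (the real codebook may permit additional values):

```python
# Value sets observed in this page's coding output; they may not be exhaustive.
OBSERVED_VALUES = {
    "responsibility": {"none", "ai_itself", "company"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none"},
    "emotion": {"indifference", "mixed", "outrage", "approval", "fear"},
}

def check_record(record: dict) -> list:
    """Return the dimensions whose value has not been observed before."""
    return [dim for dim, allowed in OBSERVED_VALUES.items()
            if record.get(dim) not in allowed]

# The record coded in the table above passes cleanly.
print(check_record({"responsibility": "none", "reasoning": "consequentialist",
                    "policy": "none", "emotion": "indifference"}))  # []
```

A check like this is useful because the values come from free-form model output, which can drift outside the codebook.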
## Raw LLM Response
```json
[
  {"id": "ytc_Ugw5kIsLp0cAeCUDhcF4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugyxp2v7e5MCow-4ZG94AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzQxiZRuA2kT6Wwtvl4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyqpSdteBwQz70wLRl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzTmCu_MHqRNPKzqNh4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzYBXsjAI_sl41bX8p4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugy2h3cx0FJdNaRSzwZ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugwn4tELYck-Xqgs-wB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwlbSTznMowbzM4XHR4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxdvrBllkDQ8Xq-qD14AaABAg", "responsibility": "company", "reasoning": "mixed", "policy": "none", "emotion": "outrage"}
]
```
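The raw response is a JSON array with one object per comment, each carrying the comment ID and the four coded dimensions. A minimal sketch of indexing such a response by ID, as the "Look up by comment ID" feature implies (the two entries below are copied from the response above; the indexing code is an illustration, not this tool's implementation):

```python
import json

# Raw model output: a JSON array of per-comment codes
# (truncated here to two entries for brevity).
raw_response = """[
  {"id": "ytc_Ugw5kIsLp0cAeCUDhcF4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxdvrBllkDQ8Xq-qD14AaABAg", "responsibility": "company",
   "reasoning": "mixed", "policy": "none", "emotion": "outrage"}
]"""

# Index the records by comment ID so any coded comment can be inspected directly.
codes_by_id = {record["id"]: record for record in json.loads(raw_response)}

record = codes_by_id["ytc_UgxdvrBllkDQ8Xq-qD14AaABAg"]
print(record["emotion"])  # outrage
```

Because each record repeats its `id`, the whole batch response can be joined back to the source comments without relying on array order.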