Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "Google's A.I. got excited when I asked it if I could call it E.L.E. That's super…" — ytc_UgwOAErA6…
- "AI is a major unjust threat to the livelihoods of ALL people. It is going to be …" — ytc_UgwNm16tE…
- "I wish we were born with a magic gift. It took years of training and practice. A…" — ytc_UgzWxgxr2…
- "Software development requires creativity and foresight, two things that robots b…" — ytc_UgxT02WaJ…
- "I actually think that AI art has ableist implications because art is one of the …" — ytc_UgzfHn83P…
- "@amethyst6489 programmer and engineers will disagree and say the opposite, pleas…" — ytr_UgxhgN__c…
- "don't forget megalomania AI, or dwarfism AI, or kronze AI: all of which could ke…" — ytc_UgzKWWLZK…
- ShadowsUmbra: "The trend right now is that the middle class will keep shrinking as…" — ytr_Ugge9EJa3…
Comment
The thing that worries me the most is that the people (engineers, business management) who "vouch" for AI to replace people are not the people who understand people. The promises feel empty and unfounded.
Even if AI could build code, it has no concept of human experience or of the actual world and its use cases. It doesn't understand all variables and conditions. There is a reason why we have management, engineers, designers, consultants, etc. If you drop any of those, you can't trust the outcome. Even worse if, as a customer, you drop all of those and decide to vibe-code it all yourself.
This is a bubble. The world doesn't work the way that "AI people" think it does. I believe we will see increased productivity if we can mitigate hallucination and keep up the pace on all fronts.
Source: youtube · Topic: AI Jobs · Posted: 2026-03-09T07:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
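Each coded comment carries the same four dimensions shown in the table above. A minimal sketch of a record type with a validity check, using only label values that appear in this page's data — the value sets below are illustrative, not the full codebook:

```python
from dataclasses import dataclass

# Label sets observed in the raw responses on this page; the actual
# codebook may define additional values.
RESPONSIBILITY = {"user", "company", "developer", "ai_itself", "none"}
REASONING = {"consequentialist", "deontological", "virtue", "mixed"}
POLICY = {"none", "liability"}
EMOTION = {"fear", "indifference", "mixed", "approval", "resignation"}

@dataclass
class CodingResult:
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def is_valid(self) -> bool:
        """Check that every dimension uses a known label."""
        return (self.responsibility in RESPONSIBILITY
                and self.reasoning in REASONING
                and self.policy in POLICY
                and self.emotion in EMOTION)

# The record from the table above.
result = CodingResult("user", "deontological", "none", "fear")
print(result.is_valid())  # True
```

A check like this is useful as a guard when ingesting model output, since an LLM can emit labels outside the codebook.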
Raw LLM Response
```json
[
  {"id":"ytc_UgxN9z4lQ_Gj8m59ye54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyACrPcM_XdwFcJUE54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgybFtrkgEDWBjKdQdJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx9v50rRljX5n6PEW14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyGiQuP6e0Wdr-0FXx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugym3T_kRpaXjyN9cvd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzvP53oeqnzcyJ27614AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugw5p6lLQUAtFfdXj414AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzCCG4R3HjxNBp6DKh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyzgAmzoDbNTmDkP854AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
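The raw response above is a plain JSON array, so looking up a comment's codes by its ID — what this page does — is a single parse-and-index step. A minimal sketch in Python, using two records copied from the batch above; `index_by_id` is an illustrative helper name, not part of the actual tool:

```python
import json

# Raw model output for one batch, truncated to two records from the
# response shown above.
raw_response = '''
[
  {"id": "ytc_UgyACrPcM_XdwFcJUE54AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyzgAmzoDbNTmDkP854AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]
'''

def index_by_id(response_text: str) -> dict:
    """Parse a raw batch response and index the coded records by comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codes = index_by_id(raw_response)
print(codes["ytc_UgyACrPcM_XdwFcJUE54AaABAg"]["emotion"])  # fear
```

Note that `json.loads` will raise `json.JSONDecodeError` if the model wraps the array in markdown fences or commentary, so a production pipeline would strip such wrappers first.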