Raw LLM Responses
Inspect the exact model output for any coded comment: look up a comment by its ID, or browse the random samples below.
- "Nobody will force you to "use an AI" (whatever that means). You'll just slowly f…" (ytr_Ugz_sPFHp…)
- "people with bad taste make bad ai art. its very possible to make art out of gen…" (ytc_UgwWyYMRF…)
- "I feel so bad for the teacher she just wants to make the teenger or adult furthe…" (ytc_Ugzw9BZKF…)
- "Im a breadman. My company brought on AI to do our orders a few years ago. We w…" (ytc_UgynEWXhe…)
- "@noynoynoyaI don't see how it would be this cheap sometimes even free if it use…" (ytr_UgzpkUSAN…)
- "The biggest mistake everyone is making is assuming that what's controlling AI de…" (ytc_UgwiYjTWF…)
- "Difference is Walmart is an actual business that has competitive advantages and …" (ytr_UgyRBVoi4…)
- "AI requires no skills therefore the people using it to "create" so called art ar…" (ytc_Ugxfvb9Hn…)
Comment
> All that is good and great, but you miss one point. Previous technology revolution was about automation of mechanical force when this AI tech revolution is about automation of intelligence. The AI technology today in June 2025 is as bad as it will ever be!
> Yes new jobs will be created as new problems arise but in which quantity and more importantly for how long before they find themselves automated as well!
> You totally avoid the impact of corporate greed, and the social impact of the mass job displacement which will happen more or less all at once everywhere on earth. You totally overlooked the fact that government world wide are not ready for that and the social unrest which will come with this mass job disruptions.
> This point of view is not inclusive and might work for a few currently privileged people but the world is more diverse than that small group!
Platform: youtube · Topic: AI Jobs · Posted: 2025-06-24T01:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
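The Coding Result table above can be regenerated from a single coded record in the shape of the raw LLM response on this page. A minimal sketch, assuming that record shape; the `to_markdown` helper and its label mapping are illustrative, not part of the actual pipeline:

```python
# Render a "Coding Result" markdown table from one coded record.
# Field names match the raw LLM response shown on this page; the display
# labels are a hypothetical mapping for illustration.
LABELS = [
    ("Responsibility", "responsibility"),
    ("Reasoning", "reasoning"),
    ("Policy", "policy"),
    ("Emotion", "emotion"),
]

def to_markdown(record: dict) -> str:
    """Return a two-column Dimension/Value markdown table for one record."""
    rows = ["| Dimension | Value |", "|---|---|"]
    rows += [f"| {label} | {record[key]} |" for label, key in LABELS]
    return "\n".join(rows)

record = {"responsibility": "none", "reasoning": "consequentialist",
          "policy": "none", "emotion": "fear"}
print(to_markdown(record))
```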
Raw LLM Response
[
{"id":"ytc_UgzIp4qqqSKlOhnMoOp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxzjSWTU1Yfe24OgyB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwCh3z8GUCkFuHiYJx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzBgA-QGX00BGhgPol4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzr3CwytAOWoDIgWpJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgweSEwXAgiKkv3R74l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzSukMrKwuvPnCEiEl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzHuZL0K9iqhLXGH-l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw9-bzBJulntJAi_h54AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw1Q-lhI_fjr4dstnh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
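The look-up-by-ID view above implies parsing a raw response like this one and indexing it by comment ID. A minimal sketch, assuming the response is a JSON array of records with the fields shown; the allowed-value sets are taken from values observed on this page and may not cover the full code book:

```python
import json

# Dimension values observed in the raw response above; the complete code
# book is an assumption beyond what this page shows.
ALLOWED = {
    "responsibility": {"none", "company", "government"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"fear", "approval", "indifference", "resignation"},
}

def index_response(raw: str) -> dict:
    """Parse a raw LLM response (JSON array of coded records) and index it
    by comment id, dropping records with out-of-vocabulary values."""
    indexed = {}
    for record in json.loads(raw):
        valid = all(record.get(dim) in values for dim, values in ALLOWED.items())
        if valid and "id" in record:
            indexed[record["id"]] = record
    return indexed

raw = ('[{"id":"ytc_UgzHuZL0K9iqhLXGH-l4AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
coded = index_response(raw)
print(coded["ytc_UgzHuZL0K9iqhLXGH-l4AaABAg"]["emotion"])  # fear
```

Indexing by ID keeps look-ups O(1) per comment, and dropping out-of-vocabulary records surfaces any off-code-book output from the model before it reaches the results table.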