Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
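Lookup by ID amounts to indexing the coded rows by their comment identifier. A minimal sketch, using two IDs and values taken from the raw response shown further down (the surrounding pipeline code is an assumption):

```python
# Hypothetical sketch: index coded rows by comment ID for O(1) lookup.
# The two rows below are copied from the raw LLM response in this page.
coded_rows = [
    {"id": "ytc_UgwAKuLID4pywo8aK1t4AaABAg", "responsibility": "company",
     "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
    {"id": "ytc_Ugy3v7ii1tjOTwy5HSV4AaABAg", "responsibility": "ai_itself",
     "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
]

# Build the index once; each lookup afterwards is a plain dict access.
by_id = {row["id"]: row for row in coded_rows}

print(by_id["ytc_UgwAKuLID4pywo8aK1t4AaABAg"]["emotion"])  # fear
```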
Random samples — click to inspect
- Altman wants regulation now, because it will stifle his competition. In order fo… (rdc_jkhi8h1)
- It's simple, AI will take many jobs in the future, graphic design will be one of… (ytc_UgxdPmzJ4…)
- Why do you first say "at risk of automation" and then "to be automated"? And can… (ytc_UgzDh6kzF…)
- Elon Musk its been warning the whole world about AI and robotics for the last de… (ytc_UgyNRzDRp…)
- "We proved that art truly does have a soul.. because AI showed us what it looks … (ytc_Ugz2wo_6d…)
- When you take your feelings out of it, this is interesting. If they can use thes… (ytc_Ugx7hxERD…)
- In TV/Radios/Media they say that AI might not be recognized as more intelligence… (ytc_Ugx3sxSMy…)
- Awesome tutorial! If you’re not checking AI suggestions with AICarma, you might … (ytc_Ugz2bhmgf…)
Comment
As a retired Silicon Valley software engineer, let me say this - AI will eliminate the need for human labor. Period. End of story. The time it will take to do this is debatable, but the end result is not. I would guess that you will see most economically valuable work replaced by AI and robotics in the next 2-10 years. By 10 years from now, the world will be unrecognizable. I don't have much faith that those who seek power are going to share the wealth, rather than seek to dominate as they always have. Most likely we are on some dystopian trajectory here which will result in a world that most of us don't want to live in. It will happen very fast once it gets going here. We have a few things to work out yet for agentic AI to take off, but nothing is going to stop it now.
Source: youtube · AI Jobs · 2025-10-08T22:2… · ♥ 9
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgwAKuLID4pywo8aK1t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy3v7ii1tjOTwy5HSV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgznS5zd0trAzVenuIV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"disapproval"},
{"id":"ytc_UgwesjmwOSMcQTK2ZqJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx_aJcqrPz51mbKLkF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwwK7lkrd3U9zCJrtt4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx4GpO0iUkzWXAJIGd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzHAZ_45seqYxezqhl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxUzeWLOIuq1PadQnR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyVQkWpwmxdjECKzqN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}]
```
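A raw response like the one above can be parsed and sanity-checked before it is accepted into the coded dataset. A minimal sketch, assuming the allowed values per dimension are only those that appear in the examples on this page (the real codebook may define more categories):

```python
import json

# Allowed values per dimension, inferred from the sample output above.
# These vocabularies are assumptions, not the authoritative codebook.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "resignation", "disapproval", "indifference",
                "outrage", "approval"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject rows with out-of-codebook values."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(f"{row.get('id')}: {dim}={value!r} not in codebook")
    return rows

# One row copied from the response above, as a single-element batch.
raw = ('[{"id":"ytc_Ugy3v7ii1tjOTwy5HSV4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"resignation"}]')
coded = validate_batch(raw)
print(coded[0]["emotion"])  # resignation
```

Validating up front keeps a single malformed or hallucinated label from silently entering the dataset; a failed batch can simply be re-requested from the model.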