Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up by comment ID.
Random samples
- "If we don't give robot rights they legally can't kill us Also: how do toasters f…" (ytc_UgzP1AhfB…)
- "The AGI capable of it (MOA/over-taking humanity) wouldnt be that stupid and shor…" (ytc_UgwOJ3BpZ…)
- "> The plan advises that federal agencies “leverage the U.S. position” in inte…" (rdc_n5bcfvn)
- "The thing is though, you can't copyright a style. Jackson Pollock famously saw t…" (ytc_UgzJ8S2z2…)
- "I can see how using A.I. on a large scale, WILL reduce the human touch, interact…" (ytc_Ugzrg6VYW…)
- "it's hard to replace thousands of workers with AI, it's a lot easier to replace …" (ytc_UgzSwa1QI…)
- "Please be careful. These tools can generate powerful-feeling experiences—but not…" (ytc_UgxuHdSOZ…)
- "From this video I understood the only way to stop ai from taking over the world …" (ytc_Ugxmm2tK_…)
Comment
To be honest, if that scenario happens, nearly every company across all sectors—food, grocery, restaurants, hotels, and even tech—would eventually become irrelevant and collapse. AI platforms like Google’s systems or ChatGPT would fail as well, because without people working and earning income, there would be no consumers left to buy products or services.
While some jobs would still exist, they would be limited—mostly in blue-collar roles. Even then, robots could realistically replace at least 50% of those jobs, further shrinking the workforce. In the end, an economy cannot survive without widespread employment, regardless of how advanced AI becomes. If driverless cars become widespread, a family or person who is not working has no need to take an Uber; the same logic applies to everything else.
youtube · AI Jobs · 2026-01-14T16:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgxU3CqSuDl3ixrfdAB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgybrV1i5ySrl7fBR8R4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwmhnThEo7ZG9QnK754AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy-4K3f3mUbmRqmTVN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw69xkZ8vbJfd6xR1V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxsRbR9Zyk2TvZv3Zt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugyf_jOKliRQxXgv1lV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwUNTr_vxDF1I0wgf54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugyo5rfhbPmlMjlxOGN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy2zeV7ERXGRqy7CGd4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"approval"}
]
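The raw response above is a JSON array of per-comment records, each carrying an `id` plus the four coded dimensions shown in the result table. A minimal sketch of how such a batch could be parsed and indexed for lookup by comment ID (the `index_codes` helper and the inlined sample payload are illustrative, not part of the tool itself):

```python
import json

# The four coded dimensions, matching the schema in the raw response above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw batch response and index the coded dimensions by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: {d: rec[d] for d in DIMENSIONS} for rec in records}

# Sample payload: the first record from the raw response above.
raw = '''[
  {"id": "ytc_UgxU3CqSuDl3ixrfdAB4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "resignation"}
]'''

codes = index_codes(raw)
print(codes["ytc_UgxU3CqSuDl3ixrfdAB4AaABAg"]["emotion"])  # resignation
```

Indexing by ID is what makes the "look up by comment ID" view cheap: one parse per batch, then constant-time lookups per comment.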