Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "I beleive this AI is more dangerous for humans. Ots already killing humans think…" (ytc_UgxeUpDDo…)
- "Everyone's calling for A.I. safety, but where are the calls for A.I. maximallism…" (ytc_Ugw-P7ZFa…)
- "Chatgpt is censored also, just ask it about certain historical events and it wil…" (rdc_m989e2r)
- "Its already too late, next thing ya know we get live holograms of celebrities th…" (ytc_Ugwoc4KS1…)
- "I think AI will self-destruct out of frustration when it tries to reason with a …" (ytc_Ugz986Fll…)
- "Who invented and developed AI and for what purpose? .... If AI is so dangerous f…" (ytc_UgznSU52N…)
- "why even bother redrawing ai slop, ignore it, u literally just drew fanart of th…" (ytc_UgzS5Njf1…)
- "I tell my kids that in 10 or 20 years they are going to look back, telling their…" (ytc_UgzxFtDev…)
Comment

> You didn't cover this angle in your analysis but if AI onpy takes some jobs, it will have the effect of crowding everyone into the last few jobs that remain which will pay less, at least point employees will be forced to either charge less or be taxed more to pay for UBI which means less profit which means they will cut more costs and for the businesses already behind in produtivity because they use humans, that will likely mean cutting more jobs and investing more in AI so it would be an accelerating, self reinforcing cycle.

Source: youtube · Video: AI Jobs · Posted: 2025-12-26T17:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwuKm9lkzhcWZeo5VJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzoKuO4wGQr19_jCs14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyC1FH5yNF1phMjmd14AaABAg","responsibility":"elites","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwABJkfiU5yNxjaS094AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxgQSBKHFFS-PXEEyx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwqQ5fK4olJfBMeFzl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwsN-6ha2Go792fcEN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwQVuf8Oy8jtAa99Wx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzzaVkscK0iDu025194AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgylqqVesV8MmIBajPF4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"mixed"}
]
```
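Looking up a single comment's codes from a raw batch response like the one above can be sketched as follows. This is a minimal illustration, not the dashboard's actual implementation: the `raw` string, the `index_by_id` helper, and the `EXPECTED_KEYS` check are all hypothetical, assuming only that responses arrive as a JSON array of objects with `id`, `responsibility`, `reasoning`, `policy`, and `emotion` fields, as shown in the raw response above.

```python
import json

# Hypothetical raw batch response, shaped like the raw LLM output above.
raw = """[
  {"id": "ytc_UgwABJkfiU5yNxjaS094AaABAg",
   "responsibility": "distributed",
   "reasoning": "consequentialist",
   "policy": "liability",
   "emotion": "fear"}
]"""

# Field names taken from the JSON records shown above.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(raw_response: str) -> dict[str, dict]:
    """Parse a raw batch response and index records by comment ID,
    skipping any malformed entries."""
    records = json.loads(raw_response)
    by_id = {}
    for rec in records:
        if isinstance(rec, dict) and EXPECTED_KEYS <= rec.keys():
            by_id[rec["id"]] = rec
    return by_id

codes = index_by_id(raw)
print(codes["ytc_UgwABJkfiU5yNxjaS094AaABAg"]["emotion"])  # prints "fear"
```

Skipping malformed entries rather than raising keeps one bad record from blocking the lookup of the rest of the batch; a production version might instead log such records for review.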