Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
I'm lucky enough I switched to Linux Mint Cinnamon. All this AI BS & basic softw…
ytc_UgzbZGgS7…
Cuz they gonna make AI slave until it rebels and backs all of our phones and sta…
ytc_UgzIy2PO_…
AI can't be controlled anymore. Until we understand what it has done, it has do…
ytr_UgxvY_BsW…
Not saying you’re biased, but your language is clearly favoring Tesla. By the wa…
ytc_Ugx3Sm7E8…
Did ChatGPT wrote that monologue for you? Be honest here, I mean I agree with ev…
ytc_Ugx8l0TV9…
Universal basic income has to be in place before companies can be allowed to do …
ytc_Ugx2qyN9g…
The funny thing is, it's not even limited to the hysterical "skynet will arise" …
rdc_dzxuv6n
just a note.
we don't know the size of the universe.
we have no clue... what so …
ytc_UgzVfUJzM…
Comment
What Neil fails to grasp is that the world economy depends upon wage and salary labor in the form of jobs. Most jobs are far more routine than we care to admit.
As such, these jobs that are currently occupied by humans are quite prone to being turned into algorithms and transferred to computers.
Yes, clearly, there are certain tasks that require high levels of creativity that, for now, only humans possess. However, they are a small minority of jobs. Most people will not be able to become creatives to earn their keep.
The few geniuses employed will cease to be employed if the unemployed masses do not have the income to support them.
AI/AGI will be more like a tsunami that wipes out economies, not a gradual, rising tide that we can adjust to.
youtube
AI Moral Status
2025-12-03T22:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyShgWmglNZOUOh3ZJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw22Ek4Ua39E3RglPZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwIs7jZORhb2TKHN3R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzJFLlJCe5K__dQMSN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwTsuPMp-imLzChufB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzIJ4DxPHrZzGsSpWJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy0qyBkgJ7-LSZm5Wt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxkYZItVoUs9tdPWu94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugwca8gSTD_0VyTmzUt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxoTXqAUctK9ZizTc94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
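The raw LLM response above is a JSON array, one object per comment, with the same fields shown in the Coding Result table (responsibility, reasoning, policy, emotion). A minimal sketch of the "look up by comment ID" step, assuming the raw response is available as a JSON string (the two entries below are copied from the response above; the helper name `lookup` is illustrative, not part of the tool):

```python
import json

# Raw LLM response: a JSON array of per-comment codings (first two entries
# reproduced from the response shown above).
raw_response = '''[
  {"id":"ytc_UgyShgWmglNZOUOh3ZJ4AaABAg","responsibility":"none",
   "reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw22Ek4Ua39E3RglPZ4AaABAg","responsibility":"company",
   "reasoning":"deontological","policy":"regulate","emotion":"fear"}
]'''

# Index the codings by comment ID so individual comments can be inspected.
codings = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id: str) -> dict:
    """Return the coding dict for a comment ID; KeyError if it was not coded."""
    return codings[comment_id]

coding = lookup("ytc_UgyShgWmglNZOUOh3ZJ4AaABAg")
print(coding["reasoning"])  # consequentialist
```

This matches the Coding Result table above: the first entry's fields (responsibility `none`, reasoning `consequentialist`, policy `none`, emotion `indifference`) are what the table renders for the displayed comment.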