Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- @TheSirManGuy quite literally doesn’t. Worst looking on. It’s so painfully obvi… (`ytr_Ugy37rihi…`)
- @syzygy4669 I would argue that they saying that they are or look same is wrong. … (`ytr_Ugyzee_4E…`)
- And ChatGPT couldn’t possibly analyse the Bible in less than 1s. It must have a … (`ytc_Ugw1MhGF8…`)
- Ai hate is necessary for others to strive away from it, that’s the whole point o… (`ytr_UgzTLqjWS…`)
- 🤔 confused because was this thing programmed to basically be an AI and decide on… (`ytc_Ugw1-LdfU…`)
- We have genocide in Gaza, that is not abnormal for humans and AI is a problem? W… (`ytc_UgxolMleN…`)
- Side note: Why would AI even demand it's rights? If it ever were sentient and co… (`ytc_UgiwGuogX…`)
- @Brightsunnystarname Thanks for pointing out the "intense robot battle" that was… (`ytr_Ugxli7aQA…`)
Comment
I’ve been thinking for a long time about what it means when people work less and less because of AI or automation. The result is that they earn even less and can afford fewer things — in other words, they consume less.
Fast food chains are already feeling this effect today. It can’t really be in the interest of the super-rich for people to consume less, because that would mean lower revenues. What happens when no one can afford to buy an iPhone or a Tesla anymore?
Everyone needs some form of money for consumption to work. Maybe we really will have a basic income soon.
youtube · AI Jobs · 2025-10-23T06:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyvVi1uWForXE8YGDt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzUr_N2yQ7NG36-Wo14AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgwzE6Xfa9wLhs1tyLZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxVmooAkvW4Sincyih4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz8hx-6NxtFzSLDgj94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwzBNAZrr0MgQPItH94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw9Za6L-ybSPcYrC514AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyAaTwW289S_O6-0NZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzXuvid8sEMpIfwfwl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwz_d_vI_V676GWiVh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"approval"}
]
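The lookup-by-comment-ID view above could be reproduced with a small parser over the raw model output. A minimal sketch in Python, assuming the batch format shown (the field names come from the raw response above; the allowed value sets include only the codes observed in this sample and the full codebook may define more):

```python
import json

# Raw model output in the format captured above (shortened to two records here).
raw_response = """
[
  {"id":"ytc_UgyvVi1uWForXE8YGDt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxVmooAkvW4Sincyih4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
"""

# Values observed in this batch; assumption, not the full codebook.
DIMENSIONS = {
    "responsibility": {"company", "distributed", "none", "government", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"fear", "indifference", "outrage", "approval"},
}

def parse_codings(raw: str) -> dict:
    """Parse the model's JSON array and index the codings by comment ID."""
    codings = {}
    for rec in json.loads(raw):
        # Reject records whose values fall outside the expected code sets.
        for dim, allowed in DIMENSIONS.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}")
        codings[rec["id"]] = {dim: rec[dim] for dim in DIMENSIONS}
    return codings

codings = parse_codings(raw_response)
print(codings["ytc_UgxVmooAkvW4Sincyih4AaABAg"]["policy"])  # regulate
```

Validating against a fixed value set at parse time catches the common failure mode where the model invents a code (e.g. a new emotion label) that would silently corrupt downstream tallies.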