Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Most purchases have been made by A.I, (in terms of the kinetic flow of money) wi… — ytr_UgyzzHc3e…
- Weirdly our own brain can become "AI artist" (personal experience) - sometimes i… — ytc_Ugyrwwjgd…
- what if robots demand rights? turn em off and remove that function from their AI… — ytc_Ugi0YkzZo…
- scratch artists, designers, writers & musicians ect being taken over by AI. not … — ytc_UgwRjeocc…
- AI is far better than all coders out there, it just it needs someone to control,… — ytc_UgwDj0doX…
- I don't think Grok was the AI doing it. I think someone fucked up big time and t… — ytc_UgwkwtuNy…
- Pro AI art people keep saying it won't actually take people's jobs, now it's act… — ytc_UgyFxcRld…
- POV AI takes over humanity but only after it realizes it's being used to further… — ytc_UgyRjESLv…
Comment
The problem with Neil’s perspective per his referencing of past examples is this:
Today, A.I. with automation are exponentially evolving at a rate that people cannot match or exceed. Basically, any new role created as a result/byproduct of A.I. (or what have you) will undoubtedly be quickly taken over by A.I. and or automation.
Insatiable greed will be our undoing.
youtube · AI Moral Status · 2025-07-31T00:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxDwJxsviz873aqH-V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzMAIbiee_l3jFVEjZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzU5jflk0VRHvPYeDt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwEZnAwT_ngVx1ahIB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwtFThDM9gSq1FbW8R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxSnfxVBB6Jj3nLBuB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzVJw3dmB5dftqfhj54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxqLpIumeTYlfwoiFh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzIlHya3EIHHQRHJaV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw6YXAJHzE0jPyb3gB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"}
]
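The raw response above is a JSON array of per-comment records, each carrying an `id` plus the four coded dimensions from the table (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of parsing and indexing such a response follows; note that the allowed value sets below are inferred only from the values visible in this sample, not from the real codebook, and `parse_response` is a hypothetical helper, not part of any tool shown here.

```python
import json

# Allowed values per dimension -- ASSUMED, inferred from the sample
# response above; the actual codebook may define more categories.
ALLOWED = {
    "responsibility": {"company", "developer", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def parse_response(raw: str) -> dict:
    """Parse a raw LLM response (JSON array) and index records by
    comment id, rejecting records with missing ids or values outside
    the assumed codebook."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Usage: look up one comment's codes by id (record copied from the sample).
raw = ('[{"id":"ytc_UgwtFThDM9gSq1FbW8R4AaABAg",'
       '"responsibility":"company","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"fear"}]')
coded = parse_response(raw)
print(coded["ytc_UgwtFThDM9gSq1FbW8R4AaABAg"]["policy"])  # regulate
```

Indexing by `id` is what makes the "look up by comment ID" view cheap: one dict lookup per inspected comment instead of a scan over the whole response.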