Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Not everything needs to be "art". Sometimes you just need AI generated "content…" (ytc_Ugyh76LZz…)
- "This was interesting cope. The idea that you'd have a different reaction to an a…" (ytc_Ugw54Fzcx…)
- "That's not how art works. There is something called fundamentals to art which ar…" (ytr_UgzdTnjha…)
- "speed damn thing up go to a Ethocracy civilization. use machines give them work …" (ytc_Ugy3rN5PW…)
- "I love AI and use ChatGPT frequently. But I also fact-check every little bit of…" (ytc_UgxBtAqlX…)
- "Its an interesting problem, I have been working with this logic for over 20 year…" (ytc_UgytqgQTm…)
- "The problem isn’t the “taking reference” of the AI. The problem is the lack of c…" (ytc_UgxTk6EDb…)
- "AI cannot get past stage 6, ever, because of the wicked, flawed, and self-destru…" (ytc_UgzG_z6Bo…)
Comment
Even Margret Atwood didn’t see this coming. However, there sectors when even the most powerful AI will rely on humans for construction and maintenance of the resources it requires to exist. Power generation and infrastructure to deliver power where it’s needed, for example. It’s need to navigate the world physically may provide some restraint, until it’s able to either satisfy its needs without human labor, or farm humans (or some version of human) to do it. In any case, even if it goes off the rails, it won’t exterminate the human race until it can change its own diapers.
youtube · AI Governance · 2025-07-10T19:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_UgwcIwtcQzHsXcAhojl4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzV63t2v6UmrKohA7N4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugx-qxLmq6ptsItyEQ14AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyJNadSymyDZdJYeCB4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugya7LI6ovUQh2tfw6l4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzXbj7dPvncnjm10lJ4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugxp2thsfrIBLcHaJwt4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugzrw-c6I56S25PaFwJ4AaABAg", "responsibility": "company", "reasoning": "contractualist", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgwJqF-Ekrak0NvLXIB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugwpkpi2LT3NNsfX2mN4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"}
]
```
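A raw response like the one above can be checked before its codes are accepted into the dashboard. The sketch below parses the JSON and flags entries with missing fields or out-of-frame values. Note the allowed value sets here are only those *observed* in this page's response and coding table; the actual code frame may define more categories, and the `validate` helper is illustrative, not part of the tool.

```python
import json

# Values observed in the raw response and coding table above.
# Assumption: the real code frame may include additional categories.
OBSERVED = {
    "responsibility": {"none", "ai_itself", "developer", "company"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "unclear"},
    "policy": {"none", "ban", "regulate"},
    "emotion": {"outrage", "resignation", "fear", "approval",
                "mixed", "indifference"},
}
REQUIRED = {"id", *OBSERVED}  # every entry needs an id plus all four dimensions


def validate(raw: str) -> list[dict]:
    """Parse a raw LLM response; raise on malformed or out-of-frame entries."""
    entries = json.loads(raw)
    for entry in entries:
        missing = REQUIRED - entry.keys()
        if missing:
            raise ValueError(f"{entry.get('id', '?')}: missing fields {sorted(missing)}")
        for dim, allowed in OBSERVED.items():
            if entry[dim] not in allowed:
                raise ValueError(f"{entry['id']}: unexpected {dim}={entry[dim]!r}")
    return entries


# Hypothetical minimal example input:
raw = ('[{"id": "ytc_example", "responsibility": "none", '
       '"reasoning": "unclear", "policy": "none", "emotion": "mixed"}]')
print(len(validate(raw)))  # 1
```

Failing fast here is the design choice: a single hallucinated label (say, `"responsibility": "society"`) is easier to catch at ingest than after it has silently skewed the dimension counts.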