Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a response by its comment ID, or browse the random samples below.
- `ytc_Ugyi_XHtl…`: "the Nobel prize was not awarded to AI, it was awarded to the people who invented…"
- `ytc_UgxUFpi6X…`: "I once chatted , and decided on separate chats to be racist against blacks, an…"
- `ytc_UgxygePz6…`: "Embarrassingly, I didn’t realize the enormity of this problem until an episode o…"
- `ytc_Ugwcvb1mw…`: "You see guys, a robot can't become self aware, it woud need a shitton of coding…"
- `ytc_UgzwlGAGR…`: "People defending this are stupid, if a majority of anime and animated media were…"
- `ytc_UgwYPnkgN…`: "I really don’t know much about this subject but i constantly fluctuate between a…"
- `ytc_UgxxkW5pO…`: "I cannot understand this man. When asked if he could go back and not do what he …"
- `ytc_Ugyi9IzYe…`: "its hilarious that during a discussion of how awesome AI is. that the closed cap…"
Comment
> 30:22 The gap will be huge: OK. The difference is that the owner didn't learn from things that Pablo built or thought; the owner never mimic what his dog does to learn how to be human and he doesn't need Pablo's eyes to ascertain if reality makes any sense. AI optimizes to a scenario; but reality is ever changing. In the limit, ALL ROBOTS TEND TO INSANITY: they need beings to ascertain what they do still makes any sense.

Source: youtube · AI Governance · 2025-08-26T13:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy29htWaxqJDB78gQt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwGYLH5aYrwyIkskcF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgweOTT0F05j-9FAtnJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxNGyO4loNUPQeZflt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugwhc49xB8f29Y0bMoV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgyRBOdvVpDVfOrCjSh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxZQ5z0fSTwahdr1sl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugy8i3mSArc_JP5FQn54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyyyuyTzLAkpe6heD14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxGT3_jPQeL0SR9SV14AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"}
]
```
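A raw response like the one above can be parsed into an index for the lookup-by-comment-ID workflow. Below is a minimal sketch in Python; the sets of allowed dimension values (`OBSERVED`) are inferred from the samples shown on this page, not from an authoritative codebook, and `parse_response` is a hypothetical helper name.

```python
import json

# Dimension values observed in the samples above; the actual coding
# scheme may allow more values (this list is inferred, not authoritative).
OBSERVED = {
    "responsibility": {"ai_itself", "developer", "user", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "regulate", "ban", "liability", "industry_self"},
    "emotion": {"fear", "resignation", "approval", "outrage"},
}

def parse_response(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded records)
    into an id -> record index, flagging any value outside the
    observed coding dimensions."""
    records = json.loads(raw)
    index = {}
    for rec in records:
        for dim, allowed in OBSERVED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}"
                )
        index[rec["id"]] = rec
    return index
```

With the index built, looking up a coded comment is a plain dictionary access, e.g. `index["ytc_Ugy29htWaxqJDB78gQt4AaABAg"]["policy"]`.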