Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
Basically AI art is real art but it isn't the person's art it's the computers ar…
ytr_Ugzm4MYMB…
Now...take all that was said here, and merge AI with quantum computers. That wou…
ytc_UgwJLBKIZ…
HELLO KRYSTLE:
EXCELLENT OVERVIEW …
1-If I have a written script, short story,…
ytc_UgzNn7FnC…
I cannot WAIT for the point where generative AI starts learning from things gene…
ytc_UgyWcZDE3…
How is AI driving going to adapt to dumb ass drivers who knowing run into trucks…
ytc_Ugza3wvnd…
Ai: “OH MY GOD JUST LEAVE!?”
me: “okay man I’m leaving *walks away*”
Ai: WHERE D…
ytc_UgwTGpqWS…
So sad these kids are not getting socialized. These kids are victims of the tech…
ytc_UgxY7mAqP…
I struggle to understand how this is true. The AI's are still language models, t…
ytc_UgxzHZTjN…
Comment
This whole problem illuminates the bigger problem for me. As technology increases, the power of the individual (and individual collectives, companies) increases, and as that happens, the risk of catastrophe from a single choice increases.
Like, if we got Star Trek replicators, how long until someone breaks the safeguards and builds a nuke from thin air?
We're not just in need of a solution to the alignment problem with AI. We need to solve it in humans too.
youtube
AI Moral Status
2025-10-30T20:2…
♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugz2hE4E9CpReAma_314AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyVhIdzqGhq2H8bhZ14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy0JaoExU09PGg4pix4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgylAN63kd9MWjd0ItB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgySFs0PK_gxMIVFjUt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxji0AkAMbhhb3hnvB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwmXX5ZRECLrKUcnkV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz3BKRuZPR0QtUOShF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwD_h3DASRiroe1Ylp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx0mznNrHBTky3gjYh4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```