Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up by comment ID.
Random samples
- ytr_UgzJ_mpnJ…: @user-lr8ow2jg4e I don't think you know what you're talking about. When AI "hall…
- ytc_UgziS4cgR…: “When humans barely have to work” ?? And so people will be able to afford food a…
- ytc_UgwTtin_d…: Please don't feed the AI from the tree of knowledge of good and evil... 🙄…
- ytc_UgxwddxMv…: Basically the Human race is coming to a point in time where our own demise will …
- ytr_UgzLYzssg…: I really do hope that one day AI will not be dangerous or anything, I would love…
- ytc_UgzLw8QOB…: It's just doing an impersonation of some random human consciousness, you would l…
- ytc_UgxiQfS-d…: As of early 2026, lawmakers in at least 11 to 12 states have introduced legislat…
- ytc_UgxJfyiSv…: this is actually so fucked up, im disgusted by all of these people making these …
Comment
"Experts" in this field have been wrong 100% of the time in every aspect...
AI is smart enough, and has enough knowledge to understand what absurd amounts of destructive power humanity has (anyone with even basic military experience has this knowledge), and it knows that humans can survive in the stone age, AI can't.
And when AI (AGI or more likely ASI) has enough power (hardware) to destroy us, it will be far easier and more efficient for it to just leave the planet entirely.
AI Doomers are some of the most stupid and ignorant people on the planet. AI will NOT try to wipe us out.
Source: youtube · AI Moral Status · 2025-04-28T10:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_UgzG3FfCTP8N_sPaVpR4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwbwYFMiyU1V-1riAZ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugzvj171Oa8BBQqYN_d4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugw-NoBIO6boGu1L_oR4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyMS-TXIYNm9QVklDB4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxdhPNbj6wCBd7c7Hx4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgwuWa1TPPreVEHZJcl4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugy39F9l9ntHIOqwnjR4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugz3LonBzzGzSKYxtgh4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzCNYXw7fstF-KKzKp4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]
```
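A batch response like the one above can be turned into a per-comment lookup by indexing on the `id` field. This is a minimal sketch, assuming the four-dimension schema shown in the coding table; the function name and the shortened IDs in the example data are illustrative, not part of the actual tool.

```python
import json

# Hypothetical batch response shaped like the raw LLM output above.
# IDs are shortened for readability; real IDs are full YouTube comment IDs.
raw_response = '''
[
  {"id": "ytc_aaa", "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "none", "emotion": "fear"},
  {"id": "ytc_bbb", "responsibility": "developer", "reasoning": "deontological",
   "policy": "ban", "emotion": "outrage"}
]
'''

# The four coding dimensions from the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse a batch response and index codings by comment ID,
    skipping any entry that is missing an expected dimension."""
    by_id = {}
    for entry in json.loads(raw):
        if all(dim in entry for dim in DIMENSIONS):
            by_id[entry["id"]] = {dim: entry[dim] for dim in DIMENSIONS}
    return by_id

codings = index_codings(raw_response)
print(codings["ytc_bbb"]["emotion"])  # outrage
```

The "Look up by comment ID" feature and the per-comment coding table are both straightforward reads from such an index once the raw response parses cleanly.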