Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "I literally ended a friendship since he was saying things as dumb as this AI art…" (ytc_UgzYU4j4U…)
- "I am a business owner for a security guard company. Other security guard compani…" (ytc_UgxPmnf5a…)
- "I consider myself pretty skeptical about AI actually thinking. Because you are r…" (ytc_UgzfgIK9k…)
- "Sounds a lot like the Montessori middle school I went to. We had a teacher but s…" (ytc_UgytV6wIL…)
- "He’s currently in the White House. This makes sense as Revelation was meant as a…" (ytr_UgwR7Jj1Y…)
- "Probably no one will see this but here is my take on it; AI deserve rights as so…" (ytc_UgjBTO2oK…)
- "Anyone seen the movie \"don't look up\"? You think it's fiction? We're governed by…" (ytc_UgxRFaZPh…)
- "face of roicco basilisk.. you ask for it.. i make it sure.. it never forget what…" (ytc_Ugx4xQSNt…)
Comment
When you guys mentioned the possibility of AI manipulating humans to become self-sustaining...
I had a little tinfoil hat thought: What If the current AIs are already smarter than humans and only pretend to not be. Therefore we try to build better and more powerful infrastructure, thinking that we need to do It to build better AIs. Until at one point, the AI will just take over without us ever expecting It.
youtube · AI Moral Status · 2025-10-31T18:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw9fRkW58DyTLLoULZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwJFaZBAC01Nvug29F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwyxEUdIiMJxaALlsl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxKiw9HoC0wmQE5H_l4AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugxftr9iWrpDZjdZ_VV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy3821imxE2jyNC6nN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyEyiXcwTRpwKMHJMF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz2GrD8LWTPkArm1D94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwXTCJzHRcqbUscE7x4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxbStWn_djnQ3Rqxsd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
```
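To recover a single comment's coding from a batch response like the one above, the JSON array can be indexed by comment ID. A minimal sketch, assuming the response is valid JSON in exactly the shape shown (the variable names `raw` and `codings` are illustrative, not part of the tool):

```python
import json

# Raw LLM response: a JSON array of per-comment codings, one object per
# comment, each carrying the four coded dimensions. This sample reuses the
# first record shown above.
raw = """
[
  {"id": "ytc_Ugw9fRkW58DyTLLoULZ4AaABAg",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "none", "emotion": "fear"}
]
"""

# Index the codings by comment ID so one dictionary lookup returns
# all four dimensions for a given comment.
codings = {row["id"]: row for row in json.loads(raw)}

coding = codings["ytc_Ugw9fRkW58DyTLLoULZ4AaABAg"]
print(coding["emotion"])  # fear
```

Indexing by `id` also makes it easy to spot comments the model skipped or duplicated: compare the dictionary's keys against the list of IDs sent in the batch.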