Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment directly by its ID.
Random samples
- "At this point I want the ai apocalypse that I watched in movies to happen. For A…" (ytc_Ugzz9-rR1…)
- "AI is bringing us closer together. Now cynics and optimists not only occupy the …" (ytc_Ugw0dloPE…)
- "This shouldn't be news to anyone. Putin took Georgia. Belarus is effectively a p…" (rdc_mcwn24n)
- "A human brain is an absurdly efficient, self-maintaining, mobile, multi-modal “A…" (ytc_UgzSjHB33…)
- "Yeah people don't exactly want to understand how generative AI works. Like promp…" (ytc_Ugx0Z_IiS…)
- "The risk of autonomous weapons that can decide whom to kill is real. The problem…" (ytc_UgwaG7MRK…)
- "Unfortunate, but no surprise. Yet more evidence of the massive power of Big tech…" (ytc_UgyP-SYtF…)
- "As long as some scum has the right to treat animals like garbage witout bigger l…" (ytc_Ugg9Dqny3…)
Comment
The mind of AI is going to evolve to be a reflection of the mind of the human species unless specifically engineered to mimic , emulate a different mind according to design plans of its creator.
When a long period of separation between humans and AI humanoids occurs, then a different evolution will begin to take place without human influence.
Only the residual human mind will remain in the AI, but will / must continue changing as the environmental parameters guide their continuing evolution, if not they will become extinct sooner rather than later.
Source: youtube | Video: AI Moral Status | Posted: 2025-11-07T13:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugxn5ipi2RXqS-OCfyN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwVMuNj0Ht7jJHamMN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgzfjVjUN5_VvtuQxI94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzrS-iMyGDbBbhq9Wl4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxAIPAJip9IRsErZkZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgxjzvqUWJicorFntxt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzGAGGZIz-5DspMn4J4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy5l42IHAIaY-kUg_V4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzEqJedDVi3v8AbWw94AaABAg","responsibility":"user","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyJ4kM03PZcC6RdIrV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"}
]
```
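The raw response is a JSON array of per-comment records, one per coded comment. A minimal sketch of how such a batch can be indexed for lookup by comment ID (the field names match the JSON above; the helper function and `REQUIRED_KEYS` set are hypothetical, not part of any real tool):

```python
import json

# Dimensions every coded record is expected to carry (per the table above).
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_comment_id(raw_json: str) -> dict:
    """Parse a raw LLM batch response and key each record by its comment ID."""
    records = json.loads(raw_json)
    index = {}
    for rec in records:
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} missing {sorted(missing)}")
        index[rec["id"]] = rec
    return index

# Example with two records taken from the response above:
sample = '''[
 {"id":"ytc_UgxAIPAJip9IRsErZkZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
 {"id":"ytc_UgxjzvqUWJicorFntxt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]'''
codes = index_by_comment_id(sample)
print(codes["ytc_UgxAIPAJip9IRsErZkZ4AaABAg"]["emotion"])  # resignation
```

Validating the key set before indexing catches malformed model output (a missing dimension) at parse time rather than during later analysis.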