Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "Artists" who splash around paint on an empty canvas and still call it artistic … (ytc_UgzPHa3Bf…)
- This is why I started ass-kissing to ChatGPT before using, “I love you; we’re go… (ytc_UgzDddS1u…)
- The skin is too perfect and flawless. You need imperfections to make things look… (ytc_UgxTjGXEq…)
- The danger is the slow, painful erosion of society and culture due to AI, at lea… (ytc_UgwdUhzsq…)
- This message's drama has been sealed by the fact i havent been using ai for a ye… (ytc_UgwHUbb4Z…)
- And what dating market is so broken and rigged that dudes resort to AI dating ap… (ytc_UgxMytpKn…)
- The software company I work for refuses to downsize the hundreds of overpaid dea… (ytc_UgyE6ER_t…)
- Just to preface this is my personal opinion. Ai is not real art. Digital art is … (ytc_Ugw7WWgT1…)
Comment
If an AI is programmed with a set of goals, it will find a way to achieve the goals. If it has access to information about its own existence and vulnerabilities, that might conceivably be enough to set up a sort-of "instinct" for self-preservation - a sort of self-generated goal, and without a framework of morality, won't limit itself to what it might do to achieve that goal. For relevant viewing, watch the 1970 movie: "Colossus: The Forbin Project".
youtube · AI Moral Status · 2025-06-04T21:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugzq_QvaR20wI87nri94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy6wFzxQdOPl_pqj-R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugyy4fqHHWT06ixOaA54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw1Y1eFuD7ijiOFflh4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxp5AO4-nxBLO5dUx14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgywuWJDUxpIeatWPrh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzwNGuVbRmbMCJSP1l4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxWHkptbxayAKWWBb94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz84OKg4euoBKaSAct4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugw0g4d_X7bccNEl0d54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
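The "look up by comment ID" step above can be sketched in a few lines: the raw model output is a JSON array of per-comment codings, so indexing it by `id` gives constant-time lookup. This is a minimal sketch, not the tool's actual implementation; the two rows are copied from the raw response shown above.

```python
import json

# Two rows copied verbatim from the raw LLM response above.
raw_response = """
[
  {"id":"ytc_Ugy6wFzxQdOPl_pqj-R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzwNGuVbRmbMCJSP1l4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
"""

# Index the codings by comment ID so any coded comment can be inspected directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

row = codings["ytc_Ugy6wFzxQdOPl_pqj-R4AaABAg"]
print(row["responsibility"], row["emotion"])  # ai_itself fear
```

A dict keyed by `id` is enough here because the IDs are unique per comment; a real inspector would also validate each row against the allowed dimension values before display.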