Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Human therapists are full of shit and charge exorbitant fees for just sitting an…" (ytc_UgyqKAZW_…)
- "Nahhh cuz I broke the Ai filter on my first bot bro 😭… (He was not so friendly n…" (ytc_Ugw2XfXr6…)
- "AI-art = using bots to break a speed running record and claiming it as your effo…" (ytc_UgwbMbbu0…)
- "1:36:20 - 1:37:00 he talking about getting rid of retiring now, the board of edu…" (ytc_UgyeApLwL…)
- "calling yourself an ai "artist" is like calling yourself a chef just because you…" (ytc_UgzoO0Jnp…)
- "Life is very complex / Live and let others live / Means do whatever / In the end de…" (ytr_Ugy7SawiN…)
- "It might take 50 years or 1000 years for Ai to become self aware but it could ha…" (ytc_Ugxsb7sLH…)
- "I truly think you're way too trusting of Tesla. They ARE death traps. The auto p…" (ytc_UgwH_5oQP…)
Comment

> It's only a matter of time before it's a possibility to bring AI girlfriends to life in our physical world like those Replicants out of Blade Runner. The blueprint already has been made. You can clone animals for tens of thousands of dollars. If someone were to do that with humans making AI replicants and give them a womb it will not only make women obsolete but it is a form of playing god on a monumental level.

Platform: youtube · AI Moral Status · 2025-10-12T07:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
    [
      {"id": "ytc_UgwwPr1sAyudYZE0GhF4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
      {"id": "ytc_Ugy0yrQxVqBpxfWGb1B4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
      {"id": "ytc_UgwMri-yMqrmc6Ie1aZ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
      {"id": "ytc_UgwdMLw_PmwPj6ttIMp4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
      {"id": "ytc_UgzqpTbfuZVPbmlxnM94AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
      {"id": "ytc_Ugw-6TEcaJx8aTkhps54AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
      {"id": "ytc_UgzjX8AorSApRHHvE1d4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
      {"id": "ytc_UgyUjBAMnnZWmILWwXN4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
      {"id": "ytc_UgzK0sTgZhAXUZzWdiJ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
      {"id": "ytc_Ugzc_U9LguuYl0_qWC94AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
    ]
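The "look up by comment ID" step can be sketched in a few lines of Python, assuming the raw LLM response is a JSON array of per-comment coding objects in the shape shown above. The two sample records are copied from that response; the function name `index_by_comment_id` is an illustrative choice, not part of the actual tool.

```python
import json

# Hypothetical two-record excerpt, in the same shape as the raw response above.
raw_response = '''[
  {"id": "ytc_UgzjX8AorSApRHHvE1d4AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgyUjBAMnnZWmILWwXN4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "ban", "emotion": "fear"}
]'''

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw coding response and index each record by its comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codings = index_by_comment_id(raw_response)
print(codings["ytc_UgyUjBAMnnZWmILWwXN4AaABAg"]["policy"])  # -> ban
```

Indexing once and looking up by ID keeps each inspection O(1), which matters if the same response is queried for many comment IDs.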