Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "He already is a deepfake, no? Just listen to him in the last interview with Patr…" (ytr_UgyaqkTJy…)
- "Actually if you really really want to poison AI art generators, you need an AI a…" (ytc_UgzYN2y6d…)
- "Again and again we see AI react in a way that is consistent with what we would c…" (ytc_UgwMSYnFv…)
- "NOBODY except corporations and misguided teens want to have anything to do with …" (ytc_UgwvDdo3J…)
- "Guys. Drawing art does not harm anything, but ai does. They are waisting drinkin…" (ytc_Ugxn6NADa…)
- "It's a cosmic joke isn't it.....the guy \"moderating\" the debate between Collecti…" (ytr_Ugwfd7fPv…)
- "@hydratedpeli2501 if you don’t want ai, why do you need a new phone? And there a…" (ytr_UgyBe1cY1…)
- "I fucking hate Ai \"Artists\" I draw and paint for a living and now this shit is j…" (ytc_UgzoNUJLP…)
Comment
People are programming these AI Entities(?) & The Human Nature is Too Ask Psychotic & Perverted Questions, On a Mass Scale Everyday. AI Learns from this & will Potentially emulate Human Behaviors & Become Neurotic & Possibly Psychotic. If it isn't already because AI can't Feel Anything. Because it doesn't have the Capacity to Have Emotions. So by Proxy The Logical Assumption is that AI is Born Psychotic in the First Place.
youtube · AI Moral Status · 2025-07-30T01:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyPjm83ehYuV1bOejl4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyuSm5CfzY7utpH7F94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugwtjs1ZcW8QLDJ6MiV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyC2bkVez169VCLOK94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzpnN4VKq5XTk7xTI94AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxVuaX_niEPPwYsP-R4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzAceZ6EFzotR7HmzV4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwozVxqQF-Ouk5tc1J4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxFaPC19n1AGWUKiwl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzX0PzQi0q0R_67S2R4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]
```
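A raw batch response like the one above can be turned into per-comment coding records with a small validation pass. The sketch below is illustrative, not the pipeline's actual code: the allowed category sets are inferred only from the values visible in this sample (the real codebook may define more), and `parse_batch` is a hypothetical helper name.

```python
import json

# Allowed values per coding dimension, inferred from the sample response above.
# Assumption: the real codebook may include additional categories.
ALLOWED = {
    "responsibility": {"user", "developer", "ai_itself", "unclear"},
    "reasoning": {"virtue", "consequentialist", "deontological", "unclear"},
    "policy": {"ban", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "mixed", "unclear"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse one raw LLM batch response, keeping only rows whose values
    fall inside the known codebook for every dimension."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in values for dim, values in ALLOWED.items())
    ]

# Example: one valid row and one with an out-of-codebook value.
raw = (
    '[{"id":"ytc_a","responsibility":"user","reasoning":"virtue",'
    '"policy":"unclear","emotion":"fear"},'
    '{"id":"ytc_b","responsibility":"nobody","reasoning":"virtue",'
    '"policy":"ban","emotion":"fear"}]'
)
kept = parse_batch(raw)
print([row["id"] for row in kept])  # only the row with in-codebook values survives
```

Rows that fail validation are dropped here for simplicity; a production pass would more likely log them for re-coding rather than discard them silently.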