Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Chatbots are great but AI apps suck 110% absolutely. The problem is that engine…" (ytc_UgyxkQnwc…)
- "How come no one talks about the massive energy needs of AI and its effect on wor…" (ytc_UgzrKr2jz…)
- "ChatGPT has been required by a court to hold onto all chat logs for a court case…" (ytc_UgxBOJLXG…)
- "It doesn't make sense. AI bros are just upset, because real artitists aren't tak…" (ytc_UgyWO215F…)
- "Robots do not have hormonal swings. They do not suffer from PMS. They do not sue…" (ytc_UgwVRhlNn…)
- "Ok, humans are very bad at predicting the future, AI is a tool used for assistan…" (ytc_Ugx5K4Nnn…)
- "@Javilin447But that artstyle that AI generates isn't it's own. It's just a recr…" (ytr_UgyuwF-qH…)
- "Man, watching a legend like Rick Rubin getting asked about AI in an interview li…" (ytc_Ugx7QNnyx…)
Comment

> AI premised on the collective consciousness of all humanity manifest to reality; given the evidence of 'evil intent' within the human psyche, should it be surprise to find such strata of 'intent' be present in the realm of Artificial Consciousness being unleashed? What humanity holds as safeguard are morality, ethics, and empathy currently lacking in the 'bits and bites' of GPUs!

youtube · AI Moral Status · 2025-12-22T17:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy398zglbbMzVRFHRt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugz_eh7otlMdSxIj2CN4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgzIi8ZmhHMprLackzB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwqGgXbj9PlzScuRbx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugws8qvc8guMcwy7Esl4AaABAg","responsibility":"user","reasoning":"virtue","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgzCrN-Ptwqtw3pqMOh4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzLPja_kW7YHfytMTt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwSWIqM4H38sKdByWl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugyj_4uqHqgm5rRTpE54AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwy5peUAMvDFGIwv2R4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
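A raw response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal, hypothetical validator: the `VOCAB` sets are inferred only from the values visible in the sample rows here, not from any official codebook, and `validate_batch` is an illustrative helper name, not part of the tool.

```python
import json

# Allowed values per coding dimension, inferred from the sample rows above.
# ASSUMPTION: the real codebook may contain additional values.
VOCAB = {
    "responsibility": {"ai_itself", "distributed", "company", "developer", "user", "unclear"},
    "reasoning": {"consequentialist", "virtue", "deontological", "contractualist", "unclear"},
    "policy": {"unclear", "none", "liability", "regulate"},
    "emotion": {"outrage", "fear", "resignation", "mixed", "approval"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and keep only well-formed rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Every row needs a string comment ID.
        if not isinstance(row.get("id"), str):
            continue
        # Every dimension must carry a value from the known vocabulary.
        if all(row.get(dim) in allowed for dim, allowed in VOCAB.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_Ugy398zglbbMzVRFHRt4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"outrage"}]')
print(len(validate_batch(raw)))  # 1
```

Rejecting rather than repairing off-vocabulary rows keeps the check simple; rejected rows could be logged and re-sent to the model in a follow-up batch.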