Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
DO NOT use AI as a friend/therapist/lover!
These chatbots are trained to mirror…
ytc_UgwK83-0S…
@purple8289 Thinking is hard, especially the way we do. I wouod wager that some …
ytr_UgznWWQ3n…
Well their explanation makes absolute sense, doesn't mean you have to like what …
rdc_n7lfpey
I think you would be very interested in a different version of ChatGPT. Loo…
ytc_UgxnpwkLD…
Youre trying to create a robot that can learn
HAVE YOU SEEN TERMINATOR
WE GOT TO…
ytc_UgiOlp03P…
Ha! Defend? By killing other humans? Like Israel did to Palestine,BBC? Wager how…
ytc_UgwaENeA2…
Always use some common sense when using AI.
I know, it's hard for many people.…
ytc_UgwbaIbUH…
Ai art sucks, normal people just can't see the lack of details, like how in Ai a…
ytc_UgyV-enPt…
Comment
One of the best books I’ve read lately is Eidos by Felden Vareth. It’s hard science fiction with a strong philosophical and existential core. It explores what happens when humanity transfers itself into a virtual world in order to survive, though what makes it truly compelling is not the premise itself, but what lies beneath it: identity, guilt, loneliness, and the meaning of life when death no longer exists. It offers no answers, only uncomfortable questions, the kind that stay with you. What does it mean to be human? Can an AI feel itself to be alive?
youtube
AI Moral Status
2026-04-02T11:4…
♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgxwG39YyOIqnq5a0DZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwWhhw98qBmIUQaUZh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyJZt9YRUJWHf5r-qd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzCOp2GjQ3WHA-SYxp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwzkTrpLrJN2d-rBY54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwQ7Fv3YwtwFG9zY3F4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwxTGSfVlH5IOhHk9t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyFggYdEWZ3uIsW4np4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy5LNl8CymZsqh7pKp4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxbU9_z-APDCETHzvB4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"fear"}
]
```
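The raw response is a JSON array of per-comment codes along the four dimensions shown in the result table. A minimal parsing-and-validation sketch, assuming the label sets are exactly those observed above (the actual codebook may define additional values, and the `raw` string below is an illustrative stand-in, not a real response):

```python
import json

# Label sets inferred from the sample response above -- an assumption,
# since the full codebook is not shown in this view.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "developer", "distributed"},
    "reasoning": {"mixed", "consequentialist", "deontological", "unclear"},
    "policy": {"none", "industry_self", "regulate", "liability"},
    "emotion": {"approval", "indifference", "fear", "mixed"},
}

def parse_coding(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only rows whose labels
    all fall within the allowed sets."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in labels for dim, labels in ALLOWED.items())
    ]

# Hypothetical example input (the id is a placeholder):
raw = '[{"id":"ytc_x","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}]'
print(len(parse_coding(raw)))  # 1
```

Dropping (rather than repairing) rows with unknown labels keeps the downstream tables honest: a row like the first sample, coded `responsibility: ai_itself`, passes, while a hallucinated label would be filtered out before it reaches the "Coding Result" view.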