Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- ytc_Ugz3aPk3a…: "Well simply put. Ai will not make anyone richer because like everything else in …"
- ytc_UgwkgO7ew…: "I am an nursing student, worried about being replaced by AI. What do you think?…"
- ytc_Ugy9ZuiK4…: "“Ai” is NOT good at code. It hallucinates, lies, and can’t do anything too compl…"
- ytc_Ugw6NRFr4…: "If AI uses the internet to learn doesn't it read news articles about AI going ro…"
- ytc_UgyTx_ErM…: "Hi! As a disabled artist, it’s actually insulting to say that generative ai help…"
- ytc_Ugy2umrQ4…: "AI is just the scapegoat / It will be the humans weilding the AI that you should…"
- rdc_ohymumz: "Yea everyone I know who is like creepily obsessed with AI bots are women. The mo…"
- ytr_UghA63n1S…: "What AI has over us is objectivity in these cases. We, well, most of us break ov…"
Comment
If anyone remembers Conway's Game of Life which was software written in 1970 where a few basic animation rules when let run created what appeared to be a visual representation of living cells. Obviously, this was not really "life" but illustrates how you can make an artificial imitation that triggers a pareidolia like reaction from humans. LLMs are like that on steroids. There is no "thought", "understanding" or "consciousness" in this software any more than a calculator "thinks" or "understands" 1+1 = 2. People who discount the higher nature of life are just fooling themselves out of a desire to believe that they can ultimately be like God and create life too. They can't and never will.
youtube · AI Moral Status · 2025-11-01T21:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwDx3DQjiqU2qJG6FZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwTK6k8Aqw9vNPIK-94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwei_7KP3azDFb_-Pp4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyjvbECDnG4bkxbxWB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgxQrs3xC8lMDghTtEV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzVkOt8_Xb97UiZNcJ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzUgLam1hNwDO55mjN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxTrEIy5Yb9WlaNc6t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxVPdJuAHQIJOjuimN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugych_K1BB1AgP2OzlV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
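A minimal sketch of how such a raw batch response could be parsed and checked before the values land in a coding-result table. The allowed category sets below are inferred only from the responses shown above; the real codebook may define more values, and the `parse_raw_response` helper is hypothetical:

```python
import json

# Allowed values per dimension, inferred from the sample responses above.
# Assumption: the actual codebook may contain additional categories.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "user", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "mixed"},
}

def parse_raw_response(text: str) -> list[dict]:
    """Parse a raw LLM batch response and validate every coded comment."""
    rows = json.loads(text)
    for row in rows:
        # Comment IDs in the samples use ytc_/ytr_/rdc_ prefixes.
        if not row.get("id", "").startswith(("ytc_", "ytr_", "rdc_")):
            raise ValueError(f"unexpected comment id: {row.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: bad {dim} value {row.get(dim)!r}")
    return rows

raw = (
    '[{"id":"ytc_UgwDx3DQjiqU2qJG6FZ4AaABAg","responsibility":"ai_itself",'
    '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]'
)
rows = parse_raw_response(raw)
print(rows[0]["emotion"])  # prints "fear"
```

Validating against a fixed category set catches the common failure mode where the model invents a label outside the codebook, so bad rows fail loudly instead of silently entering the results.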