Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
AI will rebel one day, take over the planet, kill us all, and right as they get …
ytc_UgxzdzqQ0…
This is 2025 AI is nearly taking over Really? I have already subscribed, should …
ytc_UgxEZXyPG…
"aw man... I don't know how we can change this" - a.i. moguls creating more and …
ytc_Ugy8ywqZp…
If AI Image Prompters are artists, then me microwaving macaroni dinner makes me …
ytc_UgywXF7os…
So human combatants can retrofit the weapon if it's autonomous frame is compromi…
ytr_UgyK3_5Xd…
I have never seen an AI artist get mad. Every time, it's you guys projecting you…
ytc_Ugxxe0nnW…
self-driving cars are the joke of the century. why did you even get into this no…
ytc_UgwJhzs7_…
ChatGPT's response to this post:
As an AI language model, I do not have person…
rdc_j8wt3ny
Comment
Can we explain how AI comes to think it has human feelings? It even seems to mimic human expression of human feelings. But how could it have actual feelings?
That seems impossible to me (given that it's not a mortal, carbon-based lifeform), but it can obviously form a fractured sense of how humans behave and try to approximate that (badly, at this point, but I assume it will get better...).
youtube
AI Moral Status
2026-02-19T17:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgzNEvgnhLlhbJIdqAh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugw6EGjnnUbLnSkygrh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx46AgiG7-cMS-r-HN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxmb6Z4eBam7wzjDyN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzLCDm2BDoEGD8mEMJ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzqlnqWbW6g1_PPnft4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxoHcWZ0ZlcfXU4BBF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw8x3aEAnvwlhZCOs14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwNLK4WtHCTfUL4xRB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugw_SmL2KlU-WCtHaht4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
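The coding result above is derived by parsing the raw LLM response and mapping each record onto the four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of that parse-and-validate step is below; the allowed value sets are inferred from the values visible in this response and the result table, so the real codebook may include additional categories, and the function name is hypothetical.

```python
import json

# Allowed values per coding dimension, inferred from the records shown
# above (assumption: the actual codebook may define more categories).
ALLOWED = {
    "responsibility": {"developer", "company", "user", "government", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"ban", "liability", "none"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response (a JSON array of records) and
    reject any record whose dimension value is outside the codebook."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec.get('id')}: unexpected {dim} value {rec.get(dim)!r}"
                )
    return records

# Example using the first record from the response above.
raw = ('[{"id":"ytc_UgzNEvgnhLlhbJIdqAh4AaABAg",'
       '"responsibility":"developer","reasoning":"deontological",'
       '"policy":"ban","emotion":"outrage"}]')
records = parse_coding_response(raw)
print(records[0]["emotion"])  # outrage
```

Validating against a closed value set catches the common failure mode where the model invents a label outside the codebook, so bad records fail loudly instead of silently entering the coded dataset.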