Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a response by comment ID.
Random samples

- "a good example of how AI can help without replacing or degrading results: I use …" (ytc_UgyyMkOLg…)
- "The only argument I have for AI art is: Let's say that i ask the AI to draw me a…" (ytc_UgzBT8q05…)
- "i think yes AI has so much potential to give you amazing outputs but it cannot g…" (ytc_UgyZU_zZz…)
- "AI is a great excuse for depopulation, and you can be rest assured, that's what …" (ytc_Ugzn0pSjS…)
- "what could go wrong? People get into a comfort zone and give private information…" (ytc_Ugw2A35T3…)
- "These people have no idea what really ai is. AI is much older than we think. It …" (ytc_UgxIlAJDO…)
- "they are saying AI will make programer more productive, that means one can do wo…" (ytc_UgytE0fGj…)
- "What she first said is that the technology had a more difficult time with darker…" (ytc_UgzkL_K4u…)
Comment
This might sound funny, but it's reminiscent of the animated short E.L.B.E.R.R. from Lights Are Off.
It tells the story of a guy who created a robot as a dangerous AI, and practically despite the obvious danger, he continued building it even with the red flags, and in the end, he was almost assassinated by the robot itself.
Source: youtube · AI Moral Status · 2025-12-30T18:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgzFHdZ8o5uIEqso_w54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyiGJCsvNNFiCcwNdd4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyTc535fval1J3NLWR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwRaMDwnWquKWYTd_l4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxdfSjS-0AC1pPNeMZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugxh7OUR3eXb3DMG8Mx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwXTCKoZzckNT8--Gl4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgyFgjGfae7LUduv5fB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwEZChmRVRt5PqyOzV4AaABAg","responsibility":"society","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzeFrZlVCzFjo48nTN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
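The raw response above is a JSON array with one record per coded comment. As a minimal sketch of how such a response might be parsed and indexed for the per-comment lookup shown above (the `ALLOWED` value sets are inferred from the examples on this page, not from the actual codebook, and `index_by_id` is a hypothetical helper):

```python
import json

# Allowed values per coding dimension -- assumed from the sample records
# above; the real codebook may define more or different labels.
ALLOWED = {
    "responsibility": {"user", "developer", "company", "society",
                       "distributed", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue",
                  "mixed", "unclear"},
}


def index_by_id(raw: str) -> dict:
    """Parse a raw LLM response and return {comment_id: record},
    rejecting records whose coded values fall outside ALLOWED."""
    out = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
        out[rec["id"]] = rec
    return out


# One record copied from the raw response above.
raw = ('[{"id":"ytc_UgzFHdZ8o5uIEqso_w54AaABAg","responsibility":"user",'
       '"reasoning":"deontological","policy":"none","emotion":"outrage"}]')
coded = index_by_id(raw)
print(coded["ytc_UgzFHdZ8o5uIEqso_w54AaABAg"]["emotion"])  # -> outrage
```

Validating against the allowed label sets at parse time is one way to catch model outputs that drift from the coding scheme before they reach the results table.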