Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- I honestly don't get a lot of these arguments. Why is "art" only "art" if it's b… (ytc_Ugw3_w7mZ…)
- We just refuse to learn. An AI system literally tried to blackmail a worker beca… (ytc_UgzVz3eu2…)
- And the worst part is that these companies will keep pushing because they have l… (ytc_UgwxcY2pF…)
- ai is an abomination. Revelation is necessary, its time to become violent. Viole… (ytc_Ugx0E0wCs…)
- "won't" replace list could have been better ? know ? / Jobs AI Won't Replace: … (ytc_UgwE2lZ0L…)
- Personally I wouldn't mind if we went extinct by falling birthrates. A few gener… (rdc_m9iosvs)
- Dear Mr. Hermetism / Thank you for your time and your thoughts. Oh yes Sir, I qu… (ytr_Ught0WN5r…)
- Imagine having nothing going on in your life so you call yourself an AI artist a… (ytc_Ugx5KEQve…)
Comment
It is nonsense that we do not know how AI works. The system is designed by people and based on code, algorithms and data, the model learns.
If people decide to delete the model, that is what will happen and AI will not stop them from doing so.
As in the case of fire, it can be a good servant and a bad master. If someone decides to connect it to a nuclear warhead launcher or to experiment with connecting it to living organisms, it can turn out badly. But it has happened before with other systems.
Source: youtube · AI Moral Status · 2025-06-05T08:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | industry_self |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgxSQPxEYtntpecc9sV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzEa_RlENsTK9iR_pN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxmg4I2da2p0XTYUNh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgyO11tcft7STt5v1Zh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgzvKo34F2yt-m0rKVB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwnDSTbK7YwicjLetJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyU-Dj6-R-fasSOkf14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgwfqikkRdG6M-lf9ml4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgywwvKGgnTUBNTmW1N4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwPkNCn9pXAsXa9O5B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
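The raw response above is a JSON array with one record per comment, each carrying the four coded dimensions keyed by comment ID. A minimal sketch of the lookup-by-ID step, using two records copied from the response above (the variable names are illustrative, not part of the tool):

```python
import json

# One batch of coded comments, in the shape returned by the model above
# (two records shown here; the full response contains ten).
raw_response = """
[
 {"id": "ytc_UgyU-Dj6-R-fasSOkf14AaABAg", "responsibility": "developer",
  "reasoning": "deontological", "policy": "industry_self", "emotion": "mixed"},
 {"id": "ytc_UgywwvKGgnTUBNTmW1N4AaABAg", "responsibility": "company",
  "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]
"""

# Index the batch by comment ID so any coded comment can be inspected directly.
codes_by_id = {rec["id"]: rec for rec in json.loads(raw_response)}

record = codes_by_id["ytc_UgyU-Dj6-R-fasSOkf14AaABAg"]
print(record["policy"])   # industry_self
print(record["emotion"])  # mixed
```

The ID-keyed dict mirrors the "look up by comment ID" view: each record here matches one row of the coding-result table for that comment.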