Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or pick one of the random samples below to inspect it.
- Don't see why I would pay you for art that looks just as good if made by AI.… (ytc_Ugx-1NszY…)
- 👽 welcome to the Undisclosed Forces Option Alien Megastructure CMD Human AI In… (ytc_UgzqXdXgw…)
- 10:35 limit military ai? Right now AI is bombing my country, where it decides sc… (ytc_Ugwe24t_f…)
- generative AI is theft. and I think it will fade and become less popular like cr… (ytc_Ugxi7-ssx…)
- Where are the producers of silicon valley? we want a reboot. Also, can you guys … (ytc_Ugy6e8mv6…)
- As long as AI companies monetise creation of content, they MUST pay for trainin… (ytc_Ugx5F0p1Y…)
- The best take I've heard of AI is that if it comes for programming it will termi… (ytc_UgwOc_WW4…)
- The thing is, these LLMs arent actually inventing solutions - theyre taking them… (ytc_UgwFv6Rx3…)
Comment
As a programmer I can ensure you that you don't need to have feelings or anything in robot for it to have "consciousness" (under consciousness I mean ability to process what only human brain before could, for example, understand everything it sees), but also it is really possible to make robots have some sort of feelings accidentally, some man for example accidentally made perfect simulation of gravity of some solar system trying to make colliding system, accidentally making AI feel something while trying to make it understand things is definitely possible
Source: YouTube, "AI Moral Status", posted 2021-05-03T15:2… (♥ 10)
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgzY9MQI6DX_CCOFXb94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzKJIEuhxY-PXo3Kkx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugxhqe0SG50DCpK3xo94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwG83mfjMxmSl1KXVR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzgCs0eAsqI8ZYDNHR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwuyJDKroN6U71Mni54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxfdAbhMkl7VF7nId14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx25490C9O0istXlBx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzNlya-f1wMwjftKcp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugw1GgJxNzYvlH5bWJR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
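A raw response like the one above can be turned into a lookup table keyed by comment ID. The sketch below is illustrative, not the project's actual code: the four dimension names come from the table above, and the sets of allowed values are inferred from the sample output (the full codebook may define more categories).

```python
import json

# Allowed values per dimension, inferred from the sample response above
# (assumption: the real codebook may include additional categories).
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "industry_self", "ban", "liability", "none", "unclear"},
    "emotion": {"approval", "indifference", "fear", "outrage", "unclear"},
}

def parse_response(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments) into a
    dict keyed by comment ID, validating every dimension on the way in."""
    coded = {}
    for row in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim} value {row.get(dim)!r}")
        coded[row["id"]] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Look up the coding for the comment shown above by its ID.
raw = ('[{"id":"ytc_UgzKJIEuhxY-PXo3Kkx4AaABAg",'
       '"responsibility":"developer","reasoning":"consequentialist",'
       '"policy":"industry_self","emotion":"indifference"}]')
coded = parse_response(raw)
print(coded["ytc_UgzKJIEuhxY-PXo3Kkx4AaABAg"]["policy"])  # industry_self
```

Validating against an explicit value set catches the common failure mode where the model invents an off-codebook label, which would otherwise silently corrupt downstream counts.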