Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Professor Hinton said AI confabulates like we do when we recall memories. Why wo…" (ytc_Ugy1yl0B2…)
- "People say, “We have to teach people how to keep AI under control’, shouldn’t we…" (ytc_UgzVV2Apo…)
- "Universial income...is you are a slave to whomever GIVES you that money....you d…" (ytc_UgzCQZ82m…)
- "Maybe the robots can start a sub reddit about how much they hate their job too. …" (ytc_UgxDzRKw5…)
- "I feel sorry for whoever thinks this is real 😔. They replaced the human with a r…" (ytc_Ugzi94u3I…)
- "We won't be here to buy anything, we are being exterminated by the few so they c…" (ytc_UgwkjGpUa…)
- "I've only been playing with AI for a few days, but I discovered fairly quickly t…" (ytc_UgwocQGv1…)
- "Said by the bloke who doesn't realize that Lavender gets more support than any A…" (ytr_Ugykc9tsx…)
Comment

> It all depends on how you define consciousness - surprisingly enough, there is no consensus on the definition. Besides, we'll never know how it is to _feel like_ an AI.
>
> From the moral perspective it seems appropriate to stop any further development and switch off advanced models, just in case, to avoid mass production of beings capable of suffering.

youtube · AI Moral Status · 2025-06-26T18:0… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response

```json
[
  {"id":"ytc_UgwDipyH41eYyQ21MxB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwyRUmCu7B6-hm4hUt4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyxBZ5ZvyovYxEt5iN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugzq0OXe9v7Feq4SLad4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw8ATkNLEOEPT4CPYp4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzEG4dZPqzLGTY4pKN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxiqWHcWdwJgf76ODF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugz8xDVGB6DoCvwa8Il4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzcbTP8KafveSsm3wR4AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxHlVThxeyvEku4ot54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
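The lookup-by-comment-ID step can be sketched in a few lines of Python: parse the raw JSON array returned by the model and index each coding by its `id` field. The field names match the response above; `RAW_RESPONSE` below is a shortened stand-in for the full model output, not the tool's actual storage format.

```python
import json

# Shortened stand-in for the raw model output shown above (two of the ten rows).
RAW_RESPONSE = """
[
  {"id":"ytc_UgzcbTP8KafveSsm3wR4AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz8xDVGB6DoCvwa8Il4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse a raw LLM response and key each coding by its comment ID."""
    rows = json.loads(raw)
    return {row["id"]: {k: v for k, v in row.items() if k != "id"} for row in rows}

codings = index_by_id(RAW_RESPONSE)
coding = codings["ytc_UgzcbTP8KafveSsm3wR4AaABAg"]
print(coding["responsibility"], coding["policy"])  # developer regulate
```

Looking up the selected comment's ID returns exactly the dimensions shown in the Coding Result table (responsibility `developer`, reasoning `contractualist`, policy `regulate`, emotion `fear`).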