Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up by comment ID.
Random samples

- "pretty soon ai is gonna have their own HumanGPT that asks us emotional or moral …" (ytc_UgxBP-dKR…)
- "I once asked ai how much salt I should use to brine pork belly to make homemade …" (ytc_UgwOJ9wBH…)
- "Wrong! We don't die! We go back the our pure state and have a complete understan…" (ytc_Ugw60veiU…)
- "Just make UBI to 50000 dollars a year per person, and have everyone except the f…" (ytc_UgwkG-wBh…)
- "That’s why Zuckerberg is building a bunker in Hawaii and the execs of OpenAI wer…" (ytc_UgwouXyQ9…)
- "Sophia, the robot, does not have hair because she was designed that way to appea…" (ytr_UgxOoqtEw…)
- "I think people are forgetting if no one is able to purchase a product or service…" (ytc_UgwPBdbAE…)
- "*_\"We should design AI's...\"_* Who is *\"we\"?* It is not us who are designing the…" (ytc_UgxGrgPAG…)
Comment

> God consciousness is individual and we all fall back into our natural state of ethical reasoning when we are challenged, whereas, AI will malfunction without human input into a set database to derive a solution. Let’s say 5bn people input data into the ai archive now and then in future it halves or only a quarter share data, will it be slower? Who are the people creating it? Then on the opposite side let’s say it grows exponentially and every human gives data to ai every single day? Will it be smarter? I doubt it. The system wouldn’t be diversified too vastly. Consciousness is given by the creation of life from the first cell and breath. Humans have consciousness which comes from our human souls and connection with light etc etc

Source: youtube · AI Moral Status · 2026-01-31T04:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_UgwUPR_JLdMPzMD4nul4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxeNFLUtduppmLhIs54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxqP1o59udSItddxk14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxcArGn2Se8K4S2Zb14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxT2BCAkFo3jYaAtU54AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxO-mRd5lX8vKKZ4Xx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy4CZC_zrTWLsKv9914AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugxf6wRKxc3-GxsQ8KF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx51wblD1lrQhjFuDx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz9KPXkN1b2E6kUvSF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"}]
```
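The lookup-by-comment-ID step above can be sketched in a few lines: the raw LLM response is a JSON array of per-comment codings, so parsing it and indexing by `id` gives constant-time lookup. This is a minimal illustration, not the tool's actual implementation; the two rows are abbreviated from the response shown above.

```python
import json

# Excerpt of a raw LLM response: a JSON array of per-comment codings.
raw_response = """[
 {"id": "ytc_UgwUPR_JLdMPzMD4nul4AaABAg", "responsibility": "none",
  "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
 {"id": "ytc_UgxeNFLUtduppmLhIs54AaABAg", "responsibility": "developer",
  "reasoning": "deontological", "policy": "liability", "emotion": "fear"}
]"""

# Index the codings by comment ID so any coded comment can be looked up directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgxeNFLUtduppmLhIs54AaABAg"]
print(coding["emotion"])  # fear
```

The same index also makes it easy to spot-check a random sample: draw IDs at random from `codings` and inspect the dimensions assigned to each.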