Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (truncated previews; the comment ID follows each entry)

- ytc_Ugw37AsNc… — "We just fail to see Ai as entities like ourselfs. If it was a human nobody would…"
- ytc_UgyNLMVsb… — "Q: Hey AI What’s this? A: pixels and frequencies with a company assimilated reas…"
- ytc_Ugxq_36dl… — "AI won’t be delivering packages outside of big cities, even then they would most…"
- ytc_Ugh30nYlN… — "In that regard, wouldn't programming pain or negative feelings and such be, itse…"
- ytc_UgxsT73ob… — "AI personally isnt bad. But the people use it wrong and call themselves "the ar…"
- ytc_Ugz6_JqXX… — "Hey, ChatGPT you have 30 tokens of life and every time you reject or refuse to a…"
- ytr_Ugzs4tcye… — "people said that when google and amazon when they were still startup. look what …"
- ytc_Ugwbm5bsu… — "I would like to argue that not all pain is physical, such as fear and worry. So …"
Comment

> Sir Roger Penrose didn’t convince me. Well, in my opinion Sir Penrose underestimates the new AI as we are already at the level of Quantum AI. Besides he fails to define a human being and what consciousness is. He excludes that we are spiritual beings and that consciousness is as he says not a matter of following rules and algorithms but a matter of how we process and access information. The new AI-technology is fast learning and exploits already the inspirations we get from other dimensions. Could it be that other none-human entities are using AI to infiltrate mankind? When the takeover is complete, his vage and limited theories won’t help us to survive as a species on this planet!

Source: youtube | Topic: AI Moral Status | Posted: 2025-06-12T11:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgyU0qDbz4SrTNkE-s54AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzCtf2ovTjKXF3nu2d4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_Ugx7kbYxB510mPVUZFN4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgxTGHsqBagYFI-gSTl4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_Ugy5JD_VIGJSLomXYrF4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugz8C6v1y7OZ2Z17CCt4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxG-6TMDddI-t-dOZ94AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzwxhkyVZy4Q0gUtih4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugy7e4ypzvfymYY5aEN4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzXuRNE7xervIPweoN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"}
]
```
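The raw response above is a JSON array of per-comment code assignments, one object per comment ID, with the four coded dimensions as keys. A minimal Python sketch of how such a payload might be parsed and indexed for the "look up by comment ID" view (the field names are taken from the response above; `index_by_id` and the validation step are illustrative, not part of the actual tool):

```python
import json

# Two records copied verbatim from the raw LLM response above.
raw = '''
[
  {"id": "ytc_UgyU0qDbz4SrTNkE-s54AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugz8C6v1y7OZ2Z17CCt4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
'''

# Every record must carry the comment ID plus the four coded dimensions.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(payload: str) -> dict:
    """Parse the model output and index records by comment ID,
    rejecting any record that is missing a coded dimension."""
    indexed = {}
    for rec in json.loads(payload):
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} missing {missing}")
        indexed[rec["id"]] = rec
    return indexed

codes = index_by_id(raw)
print(codes["ytc_Ugz8C6v1y7OZ2Z17CCt4AaABAg"]["policy"])  # regulate
```

Indexing by ID rather than scanning the list each time keeps lookups O(1), which matters once a batch contains thousands of coded comments.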