Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytc_Ugws_oQ9C…: "I heard its called 2 Hour Learning. I follow them and am looking closely to see …"
- ytr_UgztAEH7u…: "@ using AI is in and of itself, a hobby, one which requires quite a bit of time,…"
- ytc_UgzYd7VS5…: "I don't think anyone can imagine what the future of AI will look like. I'm both …"
- ytc_UgzkAUhwz…: "Please stop. AI/Crypto are mostly ponzi schemes propped up by foreign debt. …"
- ytc_Ugypi4j7V…: "Should be laws to limit the AI. Can you imagine 10 years from now people won’t h…"
- ytc_UgzAKk7YD…: "Self driving cars won’t destroy our cities IF Elon Musk can make a change in gov…"
- ytr_Ugw9747fS…: "Hey @AlturoJuan, thanks for chiming in! Your comment about the video really crac…"
- ytc_UgyESDwJ6…: "A robot tax isn't a big threat. If a CEO sees that he can hire a Human for $12,0…"
Comment

The idea that an AI could be "aware" it's being tested, or aware of anything at all external to it for that matter, is... so stupid. An LLM doesn't have senses. It has no objective (or at least, mostly objective) means by which to perceive the world, which is something all life has. It can only know A Thing if it's explicitly a part of its input. It doesn't even know where it's input is coming from! For all it "knows", it could be talking to another LLM!

youtube · AI Moral Status · 2025-10-31T19:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgwriyPCQf2NJ5xUW6x4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxMwAJti6JgJI_XWe54AaABAg", "responsibility": "distributed", "reasoning": "contractualist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgxCBfJl2rYSom5_k8d4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxHm0jPPpexUD2CjQJ4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgwC7d6iEK2qRteZpsN4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgyOzbTX3eTp9HG2su14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgxouMGOY3QqvE5B0Kl4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgzevhR1Tlri5VJR6t14AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgyOZ5i4NtVaTdI0DHJ4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwuM47SFqlrdRx7rLt4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]
```
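The raw response is a JSON array of per-comment codings keyed by comment ID. A minimal sketch of the lookup this page performs, assuming Python and the schema shown above (the `index_codings` helper name is hypothetical, and the `raw` string here is shortened to a single entry from the array for illustration):

```python
import json

# One entry from the raw LLM response shown above (the real payload is a
# JSON array of ten such coding objects).
raw = (
    '[{"id":"ytc_UgwriyPCQf2NJ5xUW6x4AaABAg","responsibility":"none",'
    '"reasoning":"unclear","policy":"unclear","emotion":"indifference"}]'
)

def index_codings(raw_response: str) -> dict:
    """Parse a raw LLM response and index the coding objects by comment ID."""
    codings = json.loads(raw_response)
    return {coding["id"]: coding for coding in codings}

# Look up by comment ID, as the page does.
by_id = index_codings(raw)
coding = by_id["ytc_UgwriyPCQf2NJ5xUW6x4AaABAg"]
print(coding["emotion"])  # indifference
```

Indexing by `id` makes each lookup O(1), which matters if the same batch response is inspected repeatedly for different comment IDs.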