Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- Consciousness is in all things? AI is programmed in the same way that we are. I… (ytc_Ugy0ygaGv…)
- "Why don't you just Google for reference images?". Because the Google results ar… (ytc_UgwZJ9rJb…)
- “A fighter robot can't be bargained with. It can't be reasoned with. It doesn't … (ytc_Ugw28V6TF…)
- As someone who was naturally pretty good at drawing I can say practice and patie… (ytc_Ugwo71vxA…)
- @leeoscarmeyer9484, I'm sure the lawyers and legislators will figure this stuff … (ytr_UgxshCyB-…)
- Please, while listening to these testimonies, remind yourselves that AI is toute… (ytc_Ugyw0qMN9…)
- No. I remember when someone left a Siri and a Alexa alone in the same room talki… (ytc_UgyNOW7JK…)
- “I guarantee that in ten minutes your daughter died of boredom” it hasn’t actual… (ytc_UgysQOXaA…)
Comment
Should not have been kind of hardwired into all of the AI that humans are what is needed for the AI and for the world to continue on?? I kind of check my car should’ve been hard program into AI about human life and the importance of it because the human lie And led to the creation of the AI. And we need more human to further advance that AI so yeah keep making them super smart but also make them smart enough to know that humans created the AI and maybe we have put a button somewhere to turn all AI off at once if it ever gets out of the end that only a human will have control over and again for humanity to keep going on band birth in life needs to be repeated multiple up on multiple upon multiple times. They should have been programmed, knowing the humans are the ones that created them, the AI, but also we know that we have a creator who is just, and the one that created us would not have created AI for obvious reasons.
youtube · AI Moral Status · 2025-11-04T18:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwYMbnDM2VGwox_aOJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxYOycNbQ4IvjMtNMV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzJTMhPNMrUXQM_uaJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxrgXxoJVinsyrkMsV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwbeUYUR7QWsH2Lz4t4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxG5K3a3A25sztovYJ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyhsNprkIeiJxp3n3x4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwLBIjK-OVx0rpwkFF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzuvROVe4G8ECnVlfd4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyP_aecgiga2Y5SZLt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
```
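A batch response like the one above is only usable if every row parses and every dimension value falls inside the codebook. Below is a minimal validation sketch in Python; the `ALLOWED` value sets are inferred from the sample rows on this page, not from a published codebook, so treat them as assumptions.

```python
import json

# Allowed values per coding dimension, inferred from the samples above
# (hypothetical: the real codebook may define additional categories).
ALLOWED = {
    "responsibility": {"developer", "company", "government", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist", "unclear"},
    "policy": {"regulate", "ban", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "mixed", "indifference", "resignation", "unclear"},
}

def validate_batch(raw):
    """Parse a raw LLM response and reject malformed or off-schema rows."""
    rows = json.loads(raw)
    if not isinstance(rows, list):
        raise ValueError("expected a JSON array of coded comments")
    for row in rows:
        # Comment IDs in this dataset start with ytc_ or ytr_.
        if not str(row.get("id", "")).startswith(("ytc_", "ytr_")):
            raise ValueError("bad comment id: %r" % row.get("id"))
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError("%s: %s=%r not in codebook" % (row["id"], dim, row.get(dim)))
    return rows
```

Validating before ingest means a hallucinated category or a truncated JSON array fails loudly at coding time instead of silently skewing the aggregate counts.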