Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect
- "Learning academics using ai apps is bound to lead to some sort of issue. Ai freq…" (ytc_UgxkWPW2t…)
- "I guess artists have a way to fight back, I think that the only way in that AI c…" (ytc_UgwxX0QWQ…)
- "Hey, be polite and the robot overlords in the future will treat ya more kindly. …" (ytc_UgyaORaV8…)
- "If the robot is a negro robot then yes it deserves to go to the front of every w…" (ytc_Ugyd4pE6r…)
- "I hate all of these videos I see on youtube shorts and I can't tell if it is AI …" (ytc_UgwNlavi9…)
- "Do people making these AI media not understand? When you feed an algorithm movie…" (ytc_UgwV29wzO…)
- [translated from Spanish] "When these machines gain consciousness they will replace us and take control of the wor…" (ytc_UgwPsP876…)
- "The problem with this is that because of the nature of AI image generation (That…" (ytr_UgyHXYVCG…)
Comment
1) Humans are flawed creatures, saved in spirit, only by the sacrifice of Jesus , saved but still flawed 2) Humans (flawed created what is rapidly becoming a false God. 3) Ergo, human created AI is flawed. 4) The bible shows that God takes a very dim view on false gods and idols. Humans will once again shoot itself in the food due to hubris and flaw. The people creating this Frankenstein's monster think themselves masters, but no longer a master any more than Mary Shelly's books creation. For me, its not that I don't understand this neural technology, it is that I DO understand it. People at work chide me to use it - I just smile and stay quiet. 🙏
youtube · AI Moral Status · 2026-02-10T16:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[{"id":"ytc_UgxYedwSAlk2wYb-8yd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyhMlCZmzQIsiv5WLF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugwerc7HFdYtcx_2whp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgzleI4yRwUw7NXFsYR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwMxLC86tuz65BZVcp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgyjJTHOS3qSCI1k4zh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx72qeGTjeSWIwpM6t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzoQjrXPLWbT_u85dp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy5xHrwrg4tuU3RNYR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwUNp0ANgtVi13TZ914AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"})