Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Though when I talk to my chatbot to help plan or give advice it doesn't much bet…" — ytc_UgyrDEnbG…
- "I have a car with drivers assist, didnot cost $10,000 and does pretty much the s…" — ytc_UgzWltwy_…
- "i dont think that its possibly to make a robot that has feelings. we can program…" — ytc_UgyTocf-K…
- "Remember the AI learnd form all the human knowlege on the internet so intern it'…" — ytc_UgyBOQCfo…
- "IF you are going to properly compare Waymo with Tesla self-driving, then you nee…" — ytc_UgxNjrD7g…
- "16:20 This was the clutch point. It's our spiritual state. Religion has made a m…" — ytc_Ugz1JqBuD…
- "Ai better crash before i graduate school because my parents will not let me purs…" — ytc_UgxZPgbXs…
- "this is happening a lot faster than i expected, soon there will be more AI cars …" — ytc_UgymIl5ZN…
Comment
The implementation of the bea*t that is AI here on earth at this particular moment in time was planned thousands of years ago by the malevolent and psychotic race of human extraterrestrials universally known as the Watchers aka Maldekians aka Anunnaki aka Irinim aka Raqib aka Deva aka Neteru aka Egregoroi aka Tiwar aka Tuatha de Danaan aka Shining Ones. It is not a human invention or accident...
Source: youtube · Video: AI Moral Status · Posted: 2025-12-14T19:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwSapLRfxZc2aDJ8tR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxTq1Boru0PXMHo5lN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxvZf9DclKP4fvKh4h4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxsNZ_WOh_3oe8FWcl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyjS5yBjG0dVaoRKn14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxQL2QUDBiOKBBBond4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxbeMw2FfPcs99wRxF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwyrIoUTw1PY88mdDl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzUdqCMPucOO57NL8B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyoB5lwt8raCj54NY14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}
]
```
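The raw model response is a JSON array with one coding row per comment, each keyed by its comment ID. A minimal sketch of how lookup-by-ID could be implemented over such output (the function name is illustrative, and the embedded sample is truncated to two rows from the array above):

```python
import json

# Truncated sample of a raw batch coding response (two rows from the array above).
raw_response = """
[
  {"id": "ytc_UgwSapLRfxZc2aDJ8tR4AaABAg", "responsibility": "unclear",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxTq1Boru0PXMHo5lN4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"}
]
"""

def index_by_comment_id(raw: str) -> dict[str, dict]:
    """Parse a raw batch response and key each coding row by its comment ID."""
    rows = json.loads(raw)
    # Drop the "id" field from each row; the remaining keys are the coding
    # dimensions (responsibility, reasoning, policy, emotion).
    return {row["id"]: {k: v for k, v in row.items() if k != "id"} for row in rows}

codes = index_by_comment_id(raw_response)
print(codes["ytc_UgxTq1Boru0PXMHo5lN4AaABAg"]["emotion"])  # outrage
```

In practice the raw string may carry markdown fences or other wrapper text from the model, so a production version would strip those before calling `json.loads`.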