Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- `ytc_UgztFolee…` — "Isn't this what yall were harassing China for, saying it films its citizens. And…"
- `rdc_m2a2k6u` — "Replacing senior leaders is actually something that AI is well suited to do. Wa…"
- `ytc_Ugw4cF3zA…` — "So over the internet AI can drive big trucks and air craft and robot dog's with …"
- `ytc_UgzwMcBPZ…` — "7.5 lacks robot in place off 1 lacks human what type of math is this ?🤔🤔🤔🤔🤔…"
- `ytc_UgwJKhVSA…` — "Christian commentary on this talk: https://www.youtube.com/watch?v=J9Y7Vaz20k4&t…"
- `ytc_UgxLBwX8o…` — "I am not stunned by Grok, Perplexity, Chap GPT. I'm writing a book and Grok is a…"
- `ytc_UgxNIbFON…` — "I'm with Chuck. I won't be sleeping for a month. If you want to know how AI can…"
- `ytc_UgycdEG7N…` — "personally, I hate it because not only Im an artist but in other countries deepf…"
Comment
I'm more concerned about the humans writing and controlling the AI. Imagine if Stalin, Hitler, Pol Pot or Caligula had access to AI. Perhaps having more independent AI models is the best option. These things are the sum of all out consciousness, I don't think existence without humans would be in the interest of any independent will AI. The funny thing about consciousness, is the etherial nature of it. We find it hard to speak of it outside of religion and spirituality, we only understand it exists. I just think empathy needs to be programmed in AI on a deep level.
Source: youtube · Video: AI Moral Status · Posted: 2025-12-21T11:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | industry_self |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxThS4ajTzdmbPhgd54AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzwLfNfzsKT_cI5DrN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwaItbAkUtzbepUd554AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugwq5g8rcvOi4hrbOXJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzCNCt-ksMFts7oPRR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_Ugwjc46jO8ndMCXFn9d4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwHz2BD-bcTNtTpKr14AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugyv21qBbsdzbbhpW6d4AaABAg","responsibility":"company","reasoning":"mixed","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwNQQSE9i7l7JrZeKp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzAw-O5aJKft83lsAV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]
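The raw LLM response above is a JSON array with one coding object per comment. A minimal sketch of the "look up by comment ID" step, assuming the model output is valid JSON in exactly this shape (`lookup_coding` and the two-row `raw_response` here are illustrative, not part of the tool):

```python
import json

# Illustrative raw LLM output: a JSON array of per-comment codings,
# in the same shape as the response shown above (two rows excerpted).
raw_response = '''
[
  {"id": "ytc_UgzCNCt-ksMFts7oPRR4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "industry_self", "emotion": "mixed"},
  {"id": "ytc_UgwaItbAkUtzbepUd554AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "ban", "emotion": "outrage"}
]
'''

def lookup_coding(raw: str, comment_id: str):
    """Parse the raw LLM output and return the coding dict for one comment ID,
    or None if that ID is absent from the response."""
    codings = json.loads(raw)
    by_id = {row["id"]: row for row in codings}
    return by_id.get(comment_id)

coding = lookup_coding(raw_response, "ytc_UgzCNCt-ksMFts7oPRR4AaABAg")
print(coding["policy"])  # industry_self
```

The dimension values in the "Coding Result" table (Responsibility, Reasoning, Policy, Emotion) correspond to the keys of the matching object in this array.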