Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- `ytc_UgxCneDVc…`: "Having a couple self driving cars is great but the more you add to the road the …"
- `ytr_Ugxka50-m…`: "We understand that interacting with AI can sometimes give off a creepy vibe. How…"
- `ytc_UgzcXr4fZ…`: "Lmao I'm loving the butter robot reference from Rick and Morty at the beginning …"
- `ytc_UgzTmf_PC…`: "To avoid a dislike as click bait, could you provide details as to occasions wher…"
- `ytc_UgzFdlvUW…`: "If artists die, ai art dies since it will habe to use ai art for ai art…"
- `ytc_UgyigOlGE…`: "Imagine if the trouble makers in an area moved out; or had to focus on other are…"
- `ytr_UgzwyiQ6U…`: "@kitchenersown No, it's just because using ai art for such things might come wit…"
- `ytc_Ugy4aome7…`: "This sent chills down my spine. I firmly believe that technology will be our dow…"
Comment
Problem: we are all going to be killed by greedy people who are building super ai
Solution: put people in charge who are less ambitious and more ethical to stop ai development
Same guy: wants to live forever, thinks that human lives are just a simulation and hopes his bitcoins will replace all the wealth in the world.
What if greed, evil intent and cynicism is just a by-product of lower intelligence, what if more intelligent beings by default tend to be more positive and ethical.
Then what you worry about is just a reflection of your own lower nature not the outside intelligence. Nature is already millions of times more intelligent than us.
Source: youtube · Topic: AI Governance · Posted: 2025-09-08T23:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwY349dkX9NkBhS-Ip4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugxi1POKOR2Nq5_Pb7p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwdb8Hp5ZxVOcknt0N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwxV6j5c0nvYtX02kx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwjscQthiA-s0E7HuJ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxMTpwOmPNaRX53z214AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxbM1Se3dtlpMrs5Mp4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwGc7vCcv5EeAiTFld4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyJGKLkzKiR60ARPHl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzoHML2NAmmgBNqtix4AaABAg","responsibility":"user","reasoning":"contractualist","policy":"industry_self","emotion":"approval"}
]
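The raw response above is a JSON array with one object per comment ID, each carrying the four coded dimensions. As a minimal sketch of how such output could be parsed and indexed for the "look up by comment ID" view (the variable names here are illustrative, and the array is abbreviated to two of the rows shown above):

```python
import json

# Raw LLM response: a JSON array of per-comment codings, abbreviated here.
raw_response = """
[
  {"id": "ytc_UgwY349dkX9NkBhS-Ip4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxbM1Se3dtlpMrs5Mp4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "regulate", "emotion": "outrage"}
]
"""

codings = json.loads(raw_response)

# Index by comment ID so a single coding can be retrieved directly.
by_id = {row["id"]: row for row in codings}

coding = by_id["ytc_UgxbM1Se3dtlpMrs5Mp4AaABAg"]
print(coding["responsibility"], coding["policy"])  # developer regulate
```

Indexing into a dict keyed by `id` makes each lookup O(1), which matters when the same response is inspected repeatedly across many sampled comments.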