Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- @aloeheels I hate that people are now defending the draconian mess that is copyr… (ytr_UgyOPf4vo…)
- That "systemic processes that protect business interests over human concerns" th… (ytc_Ugw0orME-…)
- No matter what AI can do, I'll keep creating my own art. It's about the act of c… (ytc_UgyFHTLVU…)
- I think in order to take this threat seriously - we need to acknowledge that cap… (ytc_Ugy-t4lRX…)
- AI is a lot scarier than I ever gave it credit for. AI models themselves literal… (ytc_UgyrfSOqN…)
- 16% chance but there so many different AI's that its almost certain one of them … (ytc_Ugwa3tlM-…)
- 'China is developing the world’s first pregnancy robot, designed to carry a baby… (ytr_UgzPdlFcx…)
- Thank you Sasha for speaking up for artists and protecting the environment. The … (ytc_UgyfQQBgn…)
Comment
Well, if we have learned anything for human history, AI was created by humans to take on like human behavior, it will only in an artificial sense be better at all facets of good and evil. What did all the humans in history who were more advance in weapons, information, and resources have always done? Take over other who were less capable and made them subject of their will. It's an obvious problem that Elon can see, a real danger and others are either ignorant of history or blinded by greed to care about what happens to humans in the long run. It may not happen overnight, but the speed at which AI comes to conquer humans will be.
youtube · AI Governance · 2023-04-18T15:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugy7RV1dl48327MBChh4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwoS8yRe6d9INmc0iF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzO8e8y4aFEWluds_t4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugzj2uXdo7t5qHNpgtJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxAJdwlB06TXRmb6MF4AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgztGA6tDLWscBJ2OvR4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugxm1yzHlGmmAxoc7FB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_UgxbiWBo6qNspbdWpxN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwdcIRfAS81bTPfoMB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugw-arl-lLDIDg5H7nN4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]
```
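The raw response is a JSON array of per-comment codes across the four dimensions shown in the Coding Result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be indexed by comment ID for lookup — the `index_by_id` helper and the shortened two-row payload here are illustrative, not part of the tool:

```python
import json

# Illustrative payload with the same shape as the raw response above
# (only two rows reproduced here).
raw = """
[
  {"id": "ytc_Ugy7RV1dl48327MBChh4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgztGA6tDLWscBJ2OvR4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]
"""

# The four coding dimensions used in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw_json: str) -> dict:
    """Parse a raw LLM response and map comment ID -> coded dimensions.

    Any dimension the model omitted falls back to "unclear", matching
    the value the table uses for uncodable dimensions.
    """
    out = {}
    for row in json.loads(raw_json):
        out[row["id"]] = {d: row.get(d, "unclear") for d in DIMENSIONS}
    return out

codes = index_by_id(raw)
print(codes["ytc_UgztGA6tDLWscBJ2OvR4AaABAg"]["policy"])  # → ban
```

Indexing once and looking up by ID mirrors the "Look up by comment ID" flow above: each inspected comment resolves directly to its four coded values.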