Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Current models can master fundamentals of art that takes years for humans to lea…" — ytr_UgxhmNalX…
- "As an artist I think Ai can be both creative and destructive, creative as in hel…" — ytc_UgziC7zgO…
- "That's the point. Its a 1% gaussian blur or something. Not enough for our eyes t…" — ytr_Ugx8QX3NS…
- "You're saying they can't know that will work, which is correct. You're also say…" — rdc_kvdslvc
- "So far Ai Sofa has done nothing wrong so why making these videos,to be on trend?…" — ytc_UgzmdLUq0…
- "The ai artists shouldn’t even have ‘artist’ in their names.They really can’t cal…" — ytc_Ugw9piOk-…
- "People talking about robot rights when insects still don't have rights, maybe th…" — ytc_UgxSBvlvW…
- "Those stats are fake, some failing companies are just being added to AI replacin…" — ytc_UgxHTEFJ2…
Comment
I think capitalism’s (especially neoliberalism) existence and humanity’s thriving are mutually exclusive once a superintelligent AI comes into existence. We will either need to shift into some sort of economic system where one’s worth is not tied to their productive output, or we will see humanity dramatically suffer.
I worry that the rich and powerful care more about their wealth and power than they do humanity. And those that recognize the existential problem, also probably have the hubris to believe they can control it. We won’t be able to control something that is orders of magnitude more intelligent than us. It’s like believing that ants could control us.
youtube · Cross-Cultural · 2025-10-01T16:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
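Each coded record carries one label per dimension, so a coding can be checked against the label vocabulary before it is stored. A minimal sketch, assuming the label sets visible in this page's samples (e.g. responsibility ∈ {company, ai_itself, distributed, none, unclear}) are the full vocabulary, which may not hold:

```python
# Allowed labels per coding dimension. These sets are inferred from the
# values visible in this page's samples and may be incomplete.
ALLOWED = {
    "responsibility": {"company", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def validate_record(record: dict) -> list[str]:
    """Return a list of problems with one coded record (empty means valid)."""
    problems = []
    if "id" not in record:
        problems.append("missing id")
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"unexpected {dim} label: {value!r}")
    return problems

# The coding shown in the table above passes validation.
record = {"id": "ytc_example", "responsibility": "distributed",
          "reasoning": "consequentialist", "policy": "regulate",
          "emotion": "fear"}
print(validate_record(record))  # → []
```

A check like this catches the common failure mode where the model invents an off-schema label or drops a dimension from its JSON.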
Raw LLM Response
[
{"id":"ytc_UgyL0FpuHAJ_SOf-9h54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyTUnEHuKi9_AngNSx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwpKjDzGzRm66utiuF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy56j8mLXeSQjyHjmx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwEt57TVstQJppsZ514AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyFnabr1qpOSV6v3gp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugxv9dRsVVDcctYgWe94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy684fhKYuFkUP7ySh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgweKuWFdST3E64bvpV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgySq8zm_6_h7fMhqgV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
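Because the raw response is a JSON array keyed by comment ID, the "look up by comment ID" view above reduces to a parse-and-index step. A sketch, assuming the response text is available as a string; the two records are copied from the batch above:

```python
import json

raw_response = '''[
 {"id":"ytc_UgweKuWFdST3E64bvpV4AaABAg","responsibility":"distributed",
  "reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_Ugy684fhKYuFkUP7ySh4AaABAg","responsibility":"company",
  "reasoning":"deontological","policy":"liability","emotion":"outrage"}
]'''

# Index the batch by comment ID for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgweKuWFdST3E64bvpV4AaABAg"]
print(coding["policy"])  # → regulate
```

If the model ever emits duplicate IDs in one batch, the dict comprehension silently keeps the last occurrence, so a length check against the parsed array is a cheap safeguard.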