Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `rdc_dmovs59`: "You apply for a non-driver's license. Regardless, I think people will still need…"
- `ytc_UgwtVavwj…`: "You know the funny part people that say is ai art but I have one question. What …"
- `ytc_Ugz6Bo20K…`: "10:21 The only way to keep AI from getting out of control and being able to be c…"
- `ytc_Ugy5C1W4t…`: "Y’all are like monkeys looking in a mirror, thinking your reflection is alive be…"
- `ytr_Ugxq4PCkr…`: "I think the timer is just a measurement of how long each AI took to respond, not…"
- `ytr_UgwGc-PAn…`: "Because the art was fed into the algorithm without permission. The lack of conse…"
- `ytc_Ugxjpg0T_…`: "Grok is a tool - its the users that are the problem. you can do that with any ot…"
- `ytc_UgyzWQwcb…`: "Corporations are using A.I. implementation as an EXCUSE for MASS LAYOFFS and has…"
Comment
The thing that most people seem to miss (though Geoffrey Hinton alludes to it) is that capitalism and the development of AI do not mix. If we want to be able to safely develop AI and continue to benefit from it, we need to remove the profit motive from the equation. If we want to ensure that humanity survives the universal implementation of AI, we need to remove the profit motive and dedicate ourselves to ensuring that we train AI in a manner that can minimize the influence of capitalism and the dominant narrative of generations of colonialism. We are creating AI in the image of the colonizer. Capitalist. Egotistical. White male dominated. And completely at odds with humanity, human nature, and the survival of the planet.
Source: youtube · AI Governance · 2025-06-23T02:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgzOrWatAb5cM20OUbN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgzQ2TQM10TRrhgX9PJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy_KWDgs88J_kX-MEx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzW-k8yKRLfnlyS63d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzVOIoHqJb-0HsMfwF4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx7NLxeBYc4gnTS9hB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugwjgv8G0aCvTFNG0cx4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzJWnUIcUaMwa7qcyB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzqjHVHHkm39R_wcaJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgxEkFgTobeCXoaEeL54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"}
]
```
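A response in this shape can be parsed, validated, and indexed by comment ID for lookup. The sketch below is illustrative only: the allowed value sets are inferred from the examples shown above (the full codebook may define more), and the function name is hypothetical.

```python
import json

# Allowed values per dimension, inferred from the samples above (assumption:
# the real codebook may include additional categories).
ALLOWED = {
    "responsibility": {"none", "company", "developer", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed"},
    "policy": {"none", "regulate", "liability", "industry_self"},
    "emotion": {"approval", "indifference", "fear", "mixed"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index records by comment ID."""
    records = json.loads(raw)
    by_id = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}")
        by_id[rec["id"]] = rec
    return by_id

# One record copied from the raw response above.
raw = (
    '[{"id":"ytc_Ugwjgv8G0aCvTFNG0cx4AaABAg","responsibility":"company",'
    '"reasoning":"contractualist","policy":"regulate","emotion":"fear"}]'
)
index = parse_coding_response(raw)
print(index["ytc_Ugwjgv8G0aCvTFNG0cx4AaABAg"]["policy"])  # regulate
```

Indexing by ID is what makes the "Look up by comment ID" view above cheap: one pass over the response, then constant-time lookups.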