Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "It seems this is that happened with God & Humans- Cosmos. The free-will was crea…" (ytc_UgycF18Ti…)
- "@BrendanDell AI may indeed simply push society into a new skills era, where prom…" (ytr_Ugy39Jg3P…)
- "More people die from side effects of drug than TESLA autopilot or drive assist o…" (ytc_UgyrTs0o6…)
- "Summary of the Current Aurarium State / Architectural Synthesis / Aurarium is not …" (ytc_Ugwzkjfjb…)
- "the problem is ai has some ethical awareness but wants a utopian outcome, and it…" (ytc_UgwMDq_GY…)
- "Creating a stronger artificial brain more advanced than a human brain will event…" (ytc_UgxCgoRcT…)
- "We need regulation without over doing it how do you get balance? AI may suffer g…" (ytc_Ugxf568cU…)
- "The creators of the Internet as we know it decided to keep it all open sourced. …" (rdc_m959q8a)
Comment (youtube · AI Governance · 2025-06-29T14:4…)

> The problem isn’t AI or robots, it’s the logic of capitalism, especially the predatory enrichment at all costs driven by neoliberalism. That mindset is what will destroy you in many ways: in terms of jobs, social instability, and an environment that’s already collapsing. Robots and AI do nothing but obediently follow the logic of this destructive market. And no, you are not part of the market, you’re not rich enough to escape. Neither you, nor your family, nor your neighbors. You’re much closer to the homeless person than to the elite running the show.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
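Each coded record can be sanity-checked before it is stored. Below is a minimal validation sketch; the allowed value sets are inferred from the samples visible on this page and are assumptions, not the tool's authoritative codebook.

```python
# Validate one coded record against the dimension values seen on this page.
# NOTE: these allowed sets are an assumption reconstructed from the samples;
# the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"company", "developer", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "ban", "none"},
    "emotion": {"outrage", "fear", "approval", "resignation", "mixed"},
}

def invalid_dimensions(record: dict) -> list[str]:
    """Return the dimension names whose value is missing or outside the allowed set."""
    return [dim for dim, allowed in ALLOWED.items()
            if record.get(dim) not in allowed]

record = {"responsibility": "company", "reasoning": "deontological",
          "policy": "regulate", "emotion": "outrage"}
print(invalid_dimensions(record))  # → []
```

A record with an unknown or missing value would surface the offending dimension names, which makes malformed model output easy to flag.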
Raw LLM Response
```json
[
  {"id":"ytc_UgwLTMYSCHPmU2Mg20t4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyMIr0x_B2Tdl9tekB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxMu8bfoFVecQJl_Jp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwitozW5xnMnrNG0Bx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgyvAvEyUwyxRbgzym94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgxYGzoQZc0ncmX7LeJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyydyrWx2hbU-FUi-N4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzONAYYbCckqyPyFvN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxzjXXFqrPjwD059GV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyO9lP79FJPWtp28WV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
```
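The "look up by comment ID" step amounts to parsing this JSON array and indexing it by the `id` field. A minimal sketch, using two records taken from the response above:

```python
import json

# Parse the raw LLM response (a JSON array of coded records) and index it
# by comment ID. raw_response is a two-record excerpt of the output shown
# on this page; the full response follows the same shape.
raw_response = '''[
  {"id": "ytc_UgxYGzoQZc0ncmX7LeJ4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgwLTMYSCHPmU2Mg20t4AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "none", "emotion": "fear"}
]'''

by_id = {rec["id"]: rec for rec in json.loads(raw_response)}
rec = by_id["ytc_UgxYGzoQZc0ncmX7LeJ4AaABAg"]
print(rec["emotion"])  # → outrage
```

Because each record carries its own `id`, the lookup stays correct even if the model returns the records in a different order than the comments were submitted.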