Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- So, if AI is going to put so any out of work, shouldn’t we lower the retirement … (ytc_UgzNygvXn…)
- AI is not Artificial. It is Demonic Intelligence that has simply assumed the "in… (ytc_UgwITOBjO…)
- You complain about my erotic slug girlfriends but you 24:59 have something calle… (ytc_UgzESwHT_…)
- The term surveillance capitalism is so much more nefarious than it sounds. It ju… (ytc_UgxiOjbzE…)
- The AI poison being called “Nightshade” just sounded poisonous to me, and I fina… (ytc_UgwRuK6Fv…)
- I'm going to be honest, I have been guilty of both tracing over art others have … (ytc_Ugw508p_o…)
- Consider this, the world is filled with angry lonely people who haven't figured … (ytc_Ugw-1QDAK…)
- AI feels like a ghost in the machine, linking us to what seems like the essence … (ytc_UgzNFXUHL…)
Comment
I see a lot of comments berating big tech here. I recently heard an argument claiming that it’s the wrong move to focus on moralizing tech CEOs like Altman or even Musk, and I’d have to agree.
The only bad guy is thinking there’s a good guy. They are just doing what anyone in their shoes would do, and those shoes will always exist to be filled. We’re fighting AI, an inevitable emergence in any technologically mature society. I do think that there is utility to our outcries in the moral standards that we set. However, it’s not Altman putting us in danger, it’s AI, and our own dangerous systemic dynamic that is likely inherent to any civilization’s social evolution. Unfortunately, I find that this take implies an uphill battle.
youtube · AI Governance · 2025-05-22T03:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwmQMx2YXqMK9rAwqd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyAj1ODGdh9iwcsEaJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugxh9XJdqNH909ewyr14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxahXa1KD7SWsO1tTJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzt4cId0Cya7suGIM54AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugzu_6S4UgMjf_nk9U54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw03Hy_8JEyyqKwZ-l4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyBKr1k2kwXpxoa5cN4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugzqdwi3JV_V4kgmeuB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzgm57jBiilQTuMkmh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
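The "Look up by comment ID" step above amounts to parsing this JSON array and indexing each record by its `id` field. A minimal sketch in Python, using the field names shown in the raw response (the helper function itself is an assumption for illustration, not part of the app):

```python
import json

# Raw LLM coding response: a JSON array of objects, one per comment,
# with the fields shown above. This sample record mirrors the coding
# result displayed for ytc_Ugzt4cId0Cya7suGIM54AaABAg.
raw_response = """
[
  {"id": "ytc_Ugzt4cId0Cya7suGIM54AaABAg",
   "responsibility": "distributed",
   "reasoning": "contractualist",
   "policy": "none",
   "emotion": "resignation"}
]
"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw coding response and key each record by its comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codings = index_by_comment_id(raw_response)
coding = codings["ytc_Ugzt4cId0Cya7suGIM54AaABAg"]
print(coding["responsibility"])  # distributed
print(coding["emotion"])         # resignation
```

Indexing by ID turns the lookup into a constant-time dictionary access, which matters once a batch response covers hundreds of coded comments.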