Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID, or inspect one of the random samples below.
- "Dude, there’s no life in the image’s eyes, genuinely how can anyone not realize …" (ytc_UgxIqb6FK…)
- "The first company to stumble on self-improving AI will have an insurmountable le…" (ytc_UgxlFqZLs…)
- "Okay just to be clear on one thing, Yudkowsky DOES think he's predicting the fut…" (ytc_UgxdAjh07…)
- "Just because Leonardo ai can generator Cool images that doesn't mean Photoshop a…" (ytc_UgzZyv2wm…)
- "Come step into any classroom that did virtual 'learning' during the pandemic and…" (ytc_UgzmaIMVJ…)
- "The AI program thieving from real artists with no say in the matter hurts enough…" (ytc_UgyOjUmJo…)
- "Yes, but it requires resourcefulness. SETI@home attempts recoup idle CPU power,…" (rdc_oh4og86)
- "I respect ai cause for this reason for if you dont attack it in any way your not…" (ytc_Ugzhnkpay…)
Comment
Human greed is going to be a huge obstacle. Those who are already in a position of extreme wealth and power won’t give it up easily, and will surely do all they can to ensure that it stays that way! You only have to look at how the politicians and wealthy individuals behave today. Not sure if AGI will overcome that somehow.
But also if AI makes companies/countries more productive while we have more free time, either they won’t care about the rest of us and keep the wealth for themselves, or they will give us the very bare minimum.
If we could have reliable AI that behaves fairly, then I’d like to see all politicians replaced by AI. No misuse of funds, money fairly distributed, caring for the environment, no wars, and positive benefits for all of us. Even with hallucinations, surely current AI couldn’t do a worse job than politicians already do?!?
Platform: youtube · Topic: AI Governance · Posted: 2025-09-07T11:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxbPXBs24PHAub1dQZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgydzkdUFJ18Ks5MkLx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzyE89NDRBKR4VwIl94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxmVnEBP9_5juJOyyd4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugzn3yK3ykP0L6FOORx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx8QBP4LSUTZe8lVb94AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgwRAQAh_SBW5t7B7xZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgzmKKzuKe8VOIzyfux4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz-SaLxTREgt-bDV_R4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgySDKpwc4MOWF5vNjh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
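Each raw response is a JSON array with one object per comment ID, carrying the four coded dimensions. A minimal sketch of how such a batch could be parsed and validated before writing rows to the coding table — note the allowed category values below are inferred only from the codes visible in this response, not from any official codebook:

```python
import json

# Allowed values per dimension, inferred from this sample response.
# The actual codebook may define additional categories.
SCHEMA = {
    "responsibility": {"company", "developer", "government", "user",
                       "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "ban", "liability", "industry_self",
               "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference",
                "approval", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a batch coding response, rejecting out-of-schema values."""
    rows = json.loads(raw)
    for row in rows:
        if "id" not in row:
            raise ValueError(f"row missing 'id': {row!r}")
        for dim, allowed in SCHEMA.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(f"{row['id']}: bad {dim} value {value!r}")
    return rows

# One row from the response above, as a smoke test.
raw = ('[{"id":"ytc_UgwRAQAh_SBW5t7B7xZ4AaABAg","responsibility":"user",'
       '"reasoning":"virtue","policy":"regulate","emotion":"resignation"}]')
rows = parse_coding_response(raw)
print(rows[0]["policy"])  # -> regulate
```

Validating against a fixed schema like this catches the common failure mode where the model invents a category label that was never in the prompt, so malformed rows fail loudly instead of silently polluting the coded dataset.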