Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Not really, humans are still more dangerous. AI would probably be more sensible,…" (ytr_UgxI0kV5P…)
- "I only hate AI art when it has time do with money, like if the 'artist' is selli…" (ytc_Ugw31m4Nl…)
- "ai will wipe out the working class and that will be great because those who dont…" (ytc_UgxxAt9dL…)
- "some do, some don't, but Winston AI actually does a solid job spotting if conten…" (ytc_Ugyj_8suV…)
- "The more we use AI the dumber and reliant on AI we humans become. Our demise as …" (ytc_UgxsY9DPd…)
- "Because they're trusting a robot, that can't see or actually understand what it'…" (ytc_UgyTqbe8O…)
- "The way I see it AI might end up helping itself and then come up with one heck o…" (ytc_UgzSxszhb…)
- "finally someone who understands that they are statistical models. I also think t…" (ytc_UgwEtlClq…)
Comment
The great thing about opinions is that they are like ass-holes, everyone has one and most of them stink. I have said this before, when you remove a job you remove a taxpayer and governments love taxpayers. So even if the technology is able, governments will never allow it to happen unless companies are going to start getting taxed at a much higher level and we know how often that happens. Also keep in mind the only people telling you this are the people developing AI.
youtube · AI Governance · 2026-04-07T17:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyEC1LKTuYRZr_0I_V4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgwuDluXcgFJm22OaVx4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgyUR5ePj-znxaPbOaZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx-WUd01SaIH7yGOW54AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzhtWjVzGeK6QLLGqF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxFZfpfCiZGF2E7v754AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugz91DQJgWc2HcULoLd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwZZJtneETSlTus_-p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxxtX4sLqB0QYN4DOx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy1gaBhNUmJMo2x1Yt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
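Since the model returns one batch response covering many comments, looking up a single comment's coding means parsing the JSON array and indexing by `id`. A minimal sketch of that lookup, using two coding objects copied from the response above (the variable and function names here are illustrative, not the pipeline's actual code):

```python
import json

# Raw batch response from the model: a JSON array with one coding
# object per comment. Fields match the Coding Result table above.
raw = '''[
  {"id": "ytc_UgyEC1LKTuYRZr_0I_V4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"},
  {"id": "ytc_UgwuDluXcgFJm22OaVx4AaABAg", "responsibility": "unclear",
   "reasoning": "mixed", "policy": "unclear", "emotion": "unclear"}
]'''

# Index the batch by comment ID so individual comments can be inspected.
codings = {row["id"]: row for row in json.loads(raw)}

# Look up the coding for one comment.
coding = codings["ytc_UgyEC1LKTuYRZr_0I_V4AaABAg"]
print(coding["policy"])   # regulate
print(coding["emotion"])  # resignation
```

Indexing by `id` rather than list position also makes it easy to detect when the model drops or duplicates a comment in its batch response, by comparing the dictionary's keys against the IDs that were sent.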