Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
So here's the deal:
When the government buys a gun, they can use the gun in any…
ytc_UgxtwyH1l…
Actual people put time and effort into their art. And then those AI people use t…
ytc_UgxstF2gX…
it's insane how much delusion you have to go through to convince yourself that u…
ytc_UgxZSh9HY…
Automation may work for line haul. But local pickup and delivery drivers for LTL…
ytc_UgzCkGCBZ…
If it's any consolation, AI is also surprisingly good at entry-level coding, so …
ytr_UgyHZq_99…
How about we restrain them based on moral compass in a way, that worse actions i…
ytc_UgzUqbBJA…
5 years ago Austin Macauley Publishers Ltd., published my book The Treatise of T…
ytc_UgxO6Vqi2…
AI is making more products. But for whom? Will robots buy those products? Where …
ytc_UgwUO_Xa_…
Comment
Considering AI managed by governments, we must hope that these governments work for the public interest and not for power, which is very difficult. The reality is that AI is managed by private individuals; once the majority of jobs are replaced, how long will it be before governments are blackmailed by private individuals? Imagining a world of unemployed people surviving on a universal basic income isn't the image of humanity with more time for itself, but rather the image of humanity in slavery!
youtube
AI Governance
2026-01-01T13:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgykSMXVgsNyxU5y2qJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzMrMa0y9txev3uT_d4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyPuxMyyz1XW5sX6q94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw0AtOa036HOsCBaGl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwhxlHR3g2aQnHo9iF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw5UHgfSU4Z0qNjIWt4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxMR5FS4Pe13oO7bqp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzUfridRRe5R0U52aF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwbL0sxqU91ladfDz54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyHrjtZIpFEHIByZ7x4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"}
]
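The raw response above is a JSON array with one object per coded comment, keyed by comment ID. A minimal sketch of the "look up by comment ID" step, assuming the response is valid JSON as shown (the variable names here are illustrative, not the tool's actual code):

```python
import json

# Excerpt of the raw LLM response shown above: a JSON array of
# per-comment codes along four dimensions.
raw_response = """
[
  {"id": "ytc_Ugw5UHgfSU4Z0qNjIWt4AaABAg",
   "responsibility": "distributed", "reasoning": "contractualist",
   "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwbL0sxqU91ladfDz54AaABAg",
   "responsibility": "developer", "reasoning": "virtue",
   "policy": "regulate", "emotion": "fear"}
]
"""

# Index the codes by comment ID so any single comment's coding
# result can be retrieved directly.
codes_by_id = {entry["id"]: entry for entry in json.loads(raw_response)}

code = codes_by_id["ytc_Ugw5UHgfSU4Z0qNjIWt4AaABAg"]
print(code["policy"], code["emotion"])  # regulate fear
```

Indexing once and looking up by ID is what lets the coded-comment view above pair each sample with its row from the batch response.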