Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- `ytr_UgwsxzSHS…`: "@justagirl6761 To be fair, when companies already had a person working on soull…"
- `rdc_erakdni`: "Ever seen an elephant? One does not simply sneak one anywhere. They're too big. …"
- `ytc_UgzqXdXgw…`: "👽 welcome to the Undisclosed Forces Option Alien Megastructure CMD Human AI In…"
- `ytr_Ugy1eyLEv…`: "Not really so scary. If AI does nothing, it will soon be smarter than the smarte…"
- `ytc_UgxtyuqJj…`: "Ai's polluting the planet with brain farts, and this is essentially the extincti…"
- `ytc_UgwMFLSu3…`: "It's hard for me to imagine AI replacing software engineers based on the quality…"
- `ytr_UgzyGgIIf…`: "@mikebrach5412 Well, if the fact that police contacts a person makes criminals t…"
- `ytr_Ugz0P02wk…`: "AI already draws really shitty hands. You don't need to poison it to do that. …"
Comment

> The world will probably still run on money, and if people don’t have an income, how exactly does that benefit anybody else than people owning selling AI programs/machines and the people owning automated manufacturing businesses? Genuinely asking, I’ve seen him say that a few times, that AI will rid the world of poverty, but can’t wrap my head around exactly how this happens.

Source: youtube | Topic: AI Governance | Posted: 2025-09-30T19:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response

```json
[
  {"id":"ytc_UgxbxmSFh3LfHyrZhn94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxlGD5Qc27d3NIfpBh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzJ6rL5s012LopURHN4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyKUKYvVDn4Sh4Evp54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxZwR_Tn2giQiF3THN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxmBz8i-K5OqsnowVd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzvDND9-XuWWyVFazp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyeXZp6oJ8KMOJKqLx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgxgidO6decsGn4k1YB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxAqaFEqOxQ-qMJnd14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
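A response like the one above has to be parsed and checked before the codes land in the table. The following is a minimal sketch of such a validation step, not the tool's actual implementation: the dimension vocabularies are inferred from the values visible on this page, and the real codebook may allow more categories. The function name `validate_coding` is hypothetical.

```python
import json

# Allowed values per dimension, inferred from the responses shown above.
# Assumption: the real codebook may define additional categories.
VOCAB = {
    "responsibility": {"company", "government", "user", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation", "indifference"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response (a JSON array of coded rows) and keep only
    rows that have an id and in-vocabulary values for every dimension."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if "id" in row
        and all(row.get(dim) in vals for dim, vals in VOCAB.items())
    ]
```

Rows with out-of-vocabulary values (a model hallucinating a new emotion label, say) are dropped rather than coerced, so the downstream counts only ever contain codebook categories.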