Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
The boys would kill me for see what I been doing after talking to a ai *cough co…
ytc_UgyF4rqov…
Hey, I can create its own rules and it does. What Penrose is trying to say is th…
ytc_UgzstS3OZ…
Subscibe to AI and be an AI trainer, make 6 figure income... users are trainers …
ytc_UgyFogsB1…
It does not have to be inevitable that AI replaces all of humanities jobs if we …
ytc_UgwlG19cE…
Self driving cars even as a concept is so fucking stupid. Like building a train …
ytc_UgxI-ONyJ…
Or worse, Ai used to make 1 loser look powerful, like the hundreds of thousands …
ytr_Ugymkdh7E…
The fact we accepted the term AI-Art was wrong to begin with.
You can make AI-Ar…
ytc_UgwLEId4I…
He was probably already in a very bad place if he took ChatGPT advice. It isn't …
ytc_UgzSY3GoQ…
Comment
There will be chaos if Super AI starts inventing and humans lose most of their jobs. The masses will not let that happen, world populations sit in dumb silence to a point, however, there would be a point where all governments would just get overthrown and killed. Governments only think they are in control whilst there is no protests on mass. Imagine 30-40m people marching to London, no Government will let that happen.
youtube
AI Governance
2025-12-13T17:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyO0JDlkgjeUtusaQd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw7dHMdIpUXoT_HJuN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwjPO3vZnVqV-2B1Hp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxNaJLpRj-aNthHe1V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx-n69juBR72ueqV9B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy6R6b0UKLdlodk5Gp4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgydTo8n1gArsWKJ9p14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyHXpIlICR4wklLHiZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwq1ri-qQ5a0b3ki-l4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwsjTAKStBOw9oMBtd4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"fear"}
]
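The raw response above is a JSON array in which each record carries the four coding dimensions shown in the result table (responsibility, reasoning, policy, emotion) plus the comment ID. A minimal sketch of how such output could be parsed and shape-checked before storing it, assuming only the schema visible in the sample (the project's full codebook may define additional labels, and `validate_coding` is a hypothetical helper, not part of the tool):

```python
import json

# Keys every coded record is expected to carry, inferred from the sample
# output above; this is an assumption, not the tool's documented schema.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}


def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and check each record's shape."""
    records = json.loads(raw)
    for i, rec in enumerate(records):
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {i} missing keys: {sorted(missing)}")
        # IDs in the dump start with ytc_ (comments) or ytr_ (replies).
        if not rec["id"].startswith(("ytc_", "ytr_")):
            raise ValueError(f"record {i} has unexpected id: {rec['id']}")
    return records


sample = (
    '[{"id":"ytc_UgyO0JDlkgjeUtusaQd4AaABAg","responsibility":"none",'
    '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]'
)
print(len(validate_coding(sample)))  # prints 1
```

Validating shape up front means a malformed or truncated LLM response fails loudly at ingest time rather than surfacing later as blank cells in the coding-result table.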