Raw LLM Responses
Inspect the exact model output for any coded comment; records can be looked up by comment ID.
Random samples
- "I think the more reasonable take on this is that it’s a good thing that someone …" (`ytr_UgxgplRTm…`)
- "I use ai to give an image to my characters. I don't ever play it off as my own. …" (`ytc_UgyDIV5nb…`)
- "I havent watched this yet but its a good thing people dont have to pay for that …" (`ytc_Ugyzy8RHf…`)
- "@brother_sothoth I am very familiar with how the neural networks and pathways fo…" (`ytr_Ugy_C42eO…`)
- "@user-uo8ny1kj4c that I am, you honestly read my comments and thought I had a sh…" (`ytr_Ugw-nxvQ-…`)
- "I would love to see this in a more deeper discussion, with maybe a theologian AI…" (`ytc_UgwbhZScE…`)
- "Is this actually Steven Fry or is it an AI trained to sound like Steven Fry?…" (`ytc_Ugzd8b9_H…`)
- "So we should listen to a guy that says he never anticipated AI would turn agains…" (`ytc_Ugxvj3FGO…`)
Comment
I think his existential conclusions are way wrong. We arent going to get cheap anything from AI. And certainly putting them in robots isnt going to help. Also, the economys existence relies on consumers. You cant eliminate that. It literally doesn't work. The problem with AI will be IF people put it in control of critical things because that will fail. Also, critical problem solving is something inherent to human nature bestowed by God. AI will never achieve unique problem solving only referential from human experience. Also, AI will generate so much data that it will begin to consume itself. The more it produces and gets mingled with human experience, the more it will begin to collapse upon itself. This will create hallucinations so bad, it will be worthless if precautions arent taken.
youtube · AI Governance · 2026-03-15T04:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
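Each coded record carries four categorical dimensions. A minimal validation sketch in Python, assuming only the value sets actually observed in this sample batch (the full codebook may define additional categories):

```python
# Allowed values per dimension, as observed in this sample batch.
# Assumption: the real codebook may include categories not seen here.
ALLOWED = {
    "responsibility": {"none", "user", "company", "government", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "liability", "ban", "regulate", "unclear"},
    "emotion": {"approval", "mixed", "indifference", "fear", "outrage", "resignation"},
}

def validate_record(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record is well-formed."""
    problems = []
    if "id" not in record:
        problems.append("missing comment id")
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

# The record from the coding-result table above:
coded = {
    "id": "ytc_UgzuD1_0Oz4YCTCPgtR4AaABAg",
    "responsibility": "user",
    "reasoning": "consequentialist",
    "policy": "liability",
    "emotion": "fear",
}
assert validate_record(coded) == []
```

Running this check over a whole batch before accepting the model output catches label drift (e.g. the model inventing a new emotion category) early.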
Raw LLM Response
[
{"id":"ytc_Ugzo6AV3Kl1h5aPEhBh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy9SAkMZv3p0Kd-BKt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxVnRo3Yoytx5j6vaV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzuD1_0Oz4YCTCPgtR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxkcmxQBz6aK5RKzbB4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz-pr7SgeGSzEg1xUl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugx0a3jpvJ-cBATNXu94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwLTPC16ukotAheqwB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyGhg4uhMt69chFKhZ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugzr-2Mu4tOu70Gizu54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
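The raw response is a JSON array of per-comment records, and the lookup-by-ID view above presumably indexes it on the `id` field. A minimal sketch of that parse-and-index step, using two records copied from the response (the variable name `raw` is hypothetical):

```python
import json

# Two records taken verbatim from the raw LLM response above.
raw = """[
 {"id": "ytc_UgzuD1_0Oz4YCTCPgtR4AaABAg", "responsibility": "user",
  "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
 {"id": "ytc_Ugz-pr7SgeGSzEg1xUl4AaABAg", "responsibility": "company",
  "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]"""

records = json.loads(raw)
# Build a dict keyed on comment ID for constant-time lookup.
by_id = {rec["id"]: rec for rec in records}

rec = by_id["ytc_UgzuD1_0Oz4YCTCPgtR4AaABAg"]
print(rec["emotion"])  # fear
```

A dict index like this is all the "look up by comment ID" feature needs, provided the IDs in the response are unique; duplicate IDs would silently keep only the last record.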