Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "At 1:12:35 i can argue that gold cannot be artificially produced at the same mar…" (ytc_UgwDHyFVD…)
- "@nickmagrick7702 Until AI is actually capable of making decisions on its own the…" (ytr_UgxXr6p3-…)
- "It's hard, but this is why I'm doing everything I can to shift my views of polit…" (rdc_o54r3xd)
- "Much of this fear of AI replacing humans comes from the pure materialist world v…" (ytc_UgxSOgp7E…)
- "Geezuz you're so full of shit dude. Humans cause crashes, yeah I know, I've bee…" (ytr_UgxPNzGFu…)
- "It all comes down to statistics, they will make mistakes, but if they make mista…" (ytc_Ugy72D3wT…)
- "Not gonna lie, I would be tempted if AI offered a device where you could get int…" (ytc_UgzXDqTj-…)
- "Ohh. So the CEOs / Stakeholders / Shareholders realizing they can’t do it themse…" (ytc_UgxdU7hSl…)
Comment
Neil has no idea about AI.
His takes about it are embarrassing.
The facts are that nothing is stopping progress, and there is no existential risk.
AI will continue to advance, and it will be part of everything in just a few years.
The haters can only cry about it. I will keep taking advantage of the best tech humanity has ever created.
youtube
AI Governance
2026-03-23T12:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwPcfYmSn5l6VDtJsh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx8dgkpmty0XJHyQyl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzqpSEbiIuAIteVycR4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxIyXALGy2B2ppXNFx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzpct9VvWjCBXnOcJx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxxzzinn5dmEBtzyhx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugy7H-X4atx29ZZ_Fpp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_Ugyf5_hIYtrkjYNPvOF4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyrHCurD917yHa_MHl4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugw1FwtkXeRSY2DdAGF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
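The raw LLM response is a JSON array with one coding object per comment, keyed by `id`. A minimal sketch of how the "look up by comment ID" step could work, assuming the response is valid JSON in exactly this shape (the variable names and the two-row sample payload here are illustrative, not the tool's actual implementation):

```python
import json

# A trimmed sample of the raw LLM response shown above: a JSON array
# of per-comment codings, each carrying the four coded dimensions.
raw_response = """
[
  {"id": "ytc_UgwPcfYmSn5l6VDtJsh4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzqpSEbiIuAIteVycR4AaABAg", "responsibility": "user",
   "reasoning": "virtue", "policy": "none", "emotion": "resignation"}
]
"""

codings = json.loads(raw_response)

# Index the array by comment ID so any coded comment can be inspected directly.
by_id = {row["id"]: row for row in codings}

coding = by_id["ytc_UgzqpSEbiIuAIteVycR4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # user resignation
```

The same index supports the random-sample view: picking IDs at random from `by_id` and showing the stored coding next to the model's raw output.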