Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a response directly by comment ID.
Random samples — click to inspect

- "People like this... the AI accelerationists, are causing the downfall of humanit…" (ytc_Ugy8T-x5d…)
- "Only one solution you can have control the AI robots or products that dont make …" (ytr_Ugzt2bvEZ…)
- "I think a lot of people are confused on why we hate AI. AI for the most part is …" (ytr_UgyreYz9e…)
- "Speaking as an AI expert who has worked for a market leader...I can say that his…" (ytc_UgyomhAJd…)
- "It is a machine language used to make the machine/robot like a human brain which…" (ytc_UgzAF2Zdw…)
- "Fearmongering nonsense. Algorithms cannot abstract away the basic truth that com…" (ytc_UgwLiqXj8…)
- "So how many employees does Artisan have? 50-200. Maybe they should get some AI …" (ytc_UgwESoVi0…)
- "Teachers do more and more on AI. Its robots teaching robots. Who even needs t…" (ytc_Ugyx7XqFv…)
Comment

> This is why AI kills itself, knowledge ends, and it's a math problem to every question, what happens when you know everything. Knowledge does end just like everything else. I can turn off Bitcoin, it's called a solar flare😂 EMP worldwide, turn off the power grid, this guy thinks he knows everything 😂 dude, you're not AI in a magic ball,

youtube · AI Governance · 2025-09-04T12:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzQf_-spZjEsDhpUFl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw6Q15OsnPCkRvBwtN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwDJC5eTGm6qu2_qop4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx_qIgVFJ6cOLv5Fdp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxzES_D9PAweOnAhmZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxTLDwhgPSWPgWOpyV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgznYCIq1Ys5ObLoX-V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw1uFSGUKExKke9DNx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxZoM4dhRtt3txYyGh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzksXuR5m9hr4X2H5d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
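The raw response above is a JSON array, one record per coded comment, with the four dimensions shown in the Coding Result table. A minimal sketch of how such a response could be parsed, validated, and indexed for the by-ID lookup is below; the allowed values are inferred only from the codes visible on this page (the full codebook may define more), and `index_response` is a hypothetical helper, not part of any tool shown here.

```python
import json

# Allowed values per dimension, inferred from the codes seen above.
# Assumption: the real codebook may include additional values.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"indifference", "outrage", "mixed", "fear",
                "approval", "resignation"},
}

def index_response(raw: str) -> dict:
    """Parse a raw LLM response (JSON array of coded comments),
    reject records with out-of-vocabulary codes, and return an
    id -> record lookup table."""
    by_id = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}: {rec.get(dim)!r}")
        by_id[rec["id"]] = rec
    return by_id

# Usage with the first record from the response above:
raw = ('[{"id":"ytc_UgzQf_-spZjEsDhpUFl4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"none","emotion":"indifference"}]')
coded = index_response(raw)
print(coded["ytc_UgzQf_-spZjEsDhpUFl4AaABAg"]["emotion"])  # indifference
```

Validating against a closed vocabulary at parse time catches the most common failure mode of schema-constrained LLM output: a well-formed JSON record containing an invented code.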