Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "I really don’t believe I’m very admired about you guys cause I know what you are…" (`ytc_UgyRLD1iu…`)
- "It’s what corporations want of course it’s going to pass, why do you think they …" (`ytc_UgyXRIb3c…`)
- "Universal base income is literally the only solution. Automation isn't going any…" (`ytc_Ugw3x2l3-…`)
- "People believe politicians who are not nearly as convincing as an ai who can mim…" (`rdc_mdj63v6`)
- "Building their own doom, and they can't stop because...it pays? Has anyone pose…" (`ytc_Ugxqo0_q6…`)
- "Yeah, I feel the same as you. I've been dabbling into Stable Diffusion to see if…" (`ytr_Ugza72ei4…`)
- "Even though they don’t necessarily do it to spite artists or mass produce anythi…" (`ytr_UgxUv8nqY…`)
- "I knew I wasn't going to like what I heard in this video. I've been avoiding AI …" (`ytc_UgwX1RlSb…`)
Comment

> Obviously AI generated so it hid the most scary possibility known as the 'paper clip maximiser'.
> An AI given the (simple and harmless) job of making paper clips so it keeps inventing and implementing new AI tools to turn everything into paper clips.
> Finally the whole earth and everything on it (including us) is a huge cosmic pile of paper clips.

youtube · AI Governance · 2025-08-12T20:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyVbUkSskYy9NHJAM14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxPdXXk_F2kh5CqoR14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugx58rWy72FDFpASMWh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxTRaQzBVo_LSj3UMB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugwb9B5BqFjDF7TY5CZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwOJCKuPzN2DB7QaDR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwI4MAacHt22JQGWUt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"unclear"},
  {"id":"ytc_Ugy1fTA35Nuu5h6T7-N4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxI9sirke1RG0oy8dV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx3GhKot0cNPD7mYCd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
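A raw response in this shape can be parsed and sanity-checked before the codings are stored. The sketch below is a minimal illustration, not the tool's actual ingestion code: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response above, but the allowed-value sets only list values that appear in this sample — the full codebook may define more.

```python
import json

# Values observed in the sample response above; the full codebook
# vocabularies are an assumption for illustration.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "user", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation", "indifference", "unclear"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response; keep only well-formed coding rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue  # drop rows with no comment ID
        # Every dimension must be present and hold a known codebook value.
        if all(row.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(row)
    return valid

# Hypothetical one-row response for demonstration.
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(parse_codings(raw))
```

Dropping malformed rows (rather than raising) keeps a single hallucinated value from discarding an otherwise usable batch; rejected IDs could instead be queued for recoding.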