Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- "What I still don't get about this AI boom: okay it's definitely in the interest …" (ytc_UgzmYMWhx…)
- "I believe AI spammers raise the value of actual human artists. I'm sick of seein…" (ytc_Ugxb-e4mo…)
- "So the problem was the "glazed over the detail"? That has nothing to do with AI…" (ytc_UgzJeQABt…)
- "On the subject of the "Accessibility" of art. Im going to be entirely honest, an…" (ytc_UgwMfeAps…)
- "I loathe people who actively sabotage neat stuff. This is why no one can have an…" (ytc_UgzyUVFVz…)
- "Idk man in some time ai will surely take over because i was able yo generate cod…" (ytc_Ugy86axP9…)
- "AI is already superior at coding and almost as sophisticated as the very top mat…" (ytc_Ugx-hvAec…)
- "You're spot on about the Anthropic CEO being biased and promoting AI in a way th…" (ytc_Ugwkowvxn…)
Comment
It's an arms race. We're going to need AI to counteract other AIs. The problem is, this arms race could easily spiral out of control. At a certain point, we lose agency over what we're creating. A machine that can weigh the facts, factors and make billions of decisions a second will not quite require "human intelligence". Just the ability to make decisions and corrections faster than we can react.
youtube | AI Responsibility | 2025-07-08T06:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
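Each coded comment carries one value per dimension. As a minimal sketch, a record can be validated against the value sets that appear in this section's output; the full codebook may define additional labels, and the `SCHEMA` and `validate` names here are assumptions for illustration:

```python
# Allowed values per dimension, as observed in this section's output.
# The full codebook may define additional labels beyond these.
SCHEMA = {
    "responsibility": {"elites", "developer", "user", "distributed", "ai_itself", "none"},
    "reasoning": {"consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "none"},
    "emotion": {"fear", "outrage", "approval", "resignation", "indifference"},
}

def validate(record: dict) -> list[str]:
    """Return the dimension names whose value is missing or not in the schema."""
    return [dim for dim, allowed in SCHEMA.items()
            if record.get(dim) not in allowed]

# The coding result shown in the table above:
coded = {"responsibility": "distributed", "reasoning": "consequentialist",
         "policy": "regulate", "emotion": "fear"}
print(validate(coded))  # [] (all dimensions valid)
```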
Raw LLM Response
[
{"id":"ytc_UgwPOSvpbUW4Ygjf0d14AaABAg","responsibility":"elites","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxHdlCzpampJWM9A5R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx3DK-fbShruAkidfR4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgySAXiU0B6WY2AQwFJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzi6MeZJ4G0Ah1TP5h4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwG_DHDNgiZS8oiONx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyEauY7UnA6PUjRp1d4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgylQdSo2ZRoryBkiJN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgwB_jyAkYSJzTNkToB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyDYjsyQgrRK7tzC2p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
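The raw response is a JSON array with one object per comment ID, which supports the lookup-by-comment-ID view directly. A minimal sketch of parsing it and indexing by ID, using two records from the response above (the `codes_by_id` name is an assumption):

```python
import json

# Raw LLM response: a JSON array of coded comments, one object per comment ID.
# Two records copied from the response above, for illustration.
raw = """[
  {"id":"ytc_UgyEauY7UnA6PUjRp1d4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgySAXiU0B6WY2AQwFJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]"""

# Index the array by comment ID so any comment's codes can be looked up directly.
codes_by_id = {row["id"]: row for row in json.loads(raw)}

record = codes_by_id["ytc_UgyEauY7UnA6PUjRp1d4AaABAg"]
print(record["responsibility"], record["emotion"])  # distributed fear
```

In a real pipeline the same index would be built over the full response array, so the inspector's ID lookup is a single dictionary access rather than a scan.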