Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> Pausing at 15 min to knee jerk react so probably off. Philosophical rigor is actually super important in this discussion!! It *really* matters how we conceptually model what AI is doing; a *lot* of our discourse is based on the implications of whatever metaphor we decide to use.

| Field | Value |
|---|---|
| Platform | youtube |
| Video | AI Moral Status |
| Posted | 2025-11-07T16:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugzl9qCl5FDr1AMZJSZ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxWcdATniuCKK78SYJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugxb3M94IM7_i-UZ1A94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyEiE-N1Aot_-H0m8Z4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugzz52KkW_UL3-JKpg14AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx0NzVGKiI1g5KkdqJ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz7svI4rquMj-K8SFp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyqBWaCr74ipTqUaBF4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgyI5aYrXREyG8qUKv14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxP2SS2P-pOaE0hRbl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
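The raw response is a JSON array of per-comment codings, one object per comment ID, with four dimensions: `responsibility`, `reasoning`, `policy`, and `emotion`. A minimal sketch of parsing and sanity-checking such a response is shown below; the allowed value sets are inferred from the output visible on this page, not from an authoritative codebook, and the two-record sample is abbreviated for brevity.

```python
import json

# Abbreviated raw LLM response (two of the records shown above).
raw = """[
  {"id":"ytc_Ugzl9qCl5FDr1AMZJSZ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyqBWaCr74ipTqUaBF4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"}
]"""

# Allowed values per dimension -- inferred from the visible output,
# not an authoritative schema.
ALLOWED = {
    "responsibility": {"unclear", "ai_itself", "none", "company", "developer"},
    "reasoning": {"mixed", "consequentialist", "deontological", "virtue"},
    "policy": {"unclear", "none", "liability", "regulate"},
    "emotion": {"indifference", "outrage", "fear", "approval"},
}

def validate(records):
    """Yield (comment_id, dimension, bad_value) for any out-of-schema field."""
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                yield rec.get("id"), dim, rec.get(dim)

records = json.loads(raw)
by_id = {r["id"]: r for r in records}  # look up a coding by comment ID
errors = list(validate(records))      # empty list means every field is in-schema
```

Indexing by `id` mirrors the lookup-by-comment-ID view of this page: once the array is parsed, `by_id["ytc_UgyqBWaCr74ipTqUaBF4AaABAg"]` returns the full coding for that comment.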