Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
AI doesn't think for you. Also research is already beyond GPT5. We should call f…
ytc_UgwxdRmoZ…
I love how the guy at the end challenged you. He was obviously well educated and…
ytc_UgxlNY4h9…
He was worried about other people jobs due to AI
But he forgot about hid job now…
ytc_UgwOCg2Vj…
As an artist I believe AI should be used as a tools like when I need inspiration…
ytc_Ugxft4vuw…
The AI doesn't actually prefer something over the other since it doesn't have su…
ytc_UgysLwPQS…
"I need to go have dinner with my family"
"Is your family more interesting than …
ytc_Ugz5JU8Kg…
AI art doesn't exist. AI users calling themselves artists are just role-playing.…
ytc_UgwRgUe3e…
Corrupt senators like Bernie are wiping out middle class. AI is a tool, and when…
ytc_Ugym-mWvv…
Comment
A note on AI being something we don't really understand: I am a researcher on Cybersecurity in AI, and a new approach to AI explainability called eXplainable AI (XAI, yes really). There are actually a lot of regulations focused on AI being human understandable before deployment (especially when algorithms are used in critical infrastructure or disaster response environments). Being able to see how an input could influence an AI's output is very important in most situations, but no one can really take advantage of it because there is such little educations on how an AI works. Granted, software development isn't taught in schools but if people are going to be interacting with hidden algorithms daily, they need to be educated on this stuff, otherwise, AI will continue to hold a "black box" status
Source: youtube · Video: AI Moral Status · Posted: 2025-11-01T20:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwqDZPwS0sJhzustSl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwfVIgjc9RUVbtK2Yx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwBkZ0RB2dzvKO0Wc54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyRx8kIRspv6bsRE4J4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxDgzcIUZXZuAzgHSR4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzYdePcFg5OXhfaaV14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz8uz5IUjT5JCw33wF4AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy3sz23nrUfIxdlCFJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwnojViKzl0G8CMj794AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyH4hSorqWq8zxU7AN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
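The raw response above is a JSON array with one object per coded comment, keyed by comment ID, with one value per coding dimension. A minimal sketch of how such a batch response could be parsed and looked up by ID follows; the `ALLOWED` value sets are inferred only from the values visible on this page (the real codebook may define more), and the `parse_codes` helper is illustrative, not part of the actual pipeline.

```python
import json

# One row from the batch response shown above, used here as sample input.
RAW = """[
 {"id": "ytc_UgwqDZPwS0sJhzustSl4AaABAg",
  "responsibility": "government", "reasoning": "consequentialist",
  "policy": "regulate", "emotion": "approval"}
]"""

# Allowed values per dimension, inferred from this page (assumption, not the codebook).
ALLOWED = {
    "responsibility": {"government", "company", "developer", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "none"},
    "policy": {"regulate", "none"},
    "emotion": {"approval", "outrage", "fear", "indifference", "resignation", "mixed"},
}

def parse_codes(raw: str) -> dict:
    """Parse a batch response, reject off-schema values, and index rows by comment ID."""
    by_id = {}
    for row in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: unexpected {dim}={row.get(dim)!r}")
        by_id[row["id"]] = {dim: row[dim] for dim in ALLOWED}
    return by_id

codes = parse_codes(RAW)
print(codes["ytc_UgwqDZPwS0sJhzustSl4AaABAg"]["policy"])  # regulate
```

Validating against a closed value set at parse time is what makes the "look up by comment ID" view reliable: a model response that drifts off the codebook fails loudly instead of being silently displayed.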