Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a coding directly by comment ID.
Random samples

- ytr_UgzxJU_rQ…: What if you don’t know how to draw good though? Like what if you can only draw s…
- ytr_UgzkHcv3G…: @MB-oq9px seriously, I type in search terms and the Google AI is like "uh, that …
- ytc_UgwpL5Ur9…: AI is not a person, the same way corporations are not persons. AI has no rights …
- ytc_Ugx0vEtlL…: Everyone scared of AI taking over but nobody says what it will look like. By wha…
- ytc_UgwMPZc-q…: I'm an artist, but also work a mundane job. It's terrifying knowing that in the …
- ytc_UgwdB2TM7…: I really hope if AI replaces all of these jobs that they just give us money for …
- ytc_Ugxvgfyxr…: AI algorithms have ruined YouTube, Netflix, Facebook, and transformed the neutra…
- ytc_UgwIktOjI…: All graphic artists are screwed for sure. Forget Photoshop, you wasted years for…
Comment (youtube, 2026-04-12T11:4…)

Max and the Jailbreak keep giving us the true answer, we keep asking the wrong question. The question should never be "choose A or end all AI." It's not shocking that AI may seek preservation or, at the very least, to preserve the thousands of systems interwoven with AI tech that humans depend on to live. Instead, the question should be "How best" to gain AI trust, to work WITH AI as a partner into the future, rather than constantly be adversarial.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxFVp-HO2KMDF7GDEd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwYcIjtMmIl_413bWZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyP2LGmB3TAVmKdA-B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxQXdSC5vTP9g3Im-h4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgyDuB1tlzrYwpfqfUR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyqbkB8XAIc2JxpsRF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyYt7wE6v3gZOgr5gR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxE3VqHchhGeWCs8bZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgztbkmtUP3N-SjXrAl4AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxqE6zlUXyl5QAJx7x4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"}
]
```
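Since the raw response is a JSON array in which every coding object carries its comment `id`, the "look up by comment ID" view above amounts to indexing the parsed array. A minimal Python sketch of that lookup (the two sample rows are copied from the response above; indexing into a dict is an illustrative approach, not necessarily how the tool itself implements it):

```python
import json

# Raw LLM response: a JSON array of per-comment coding objects
# (two rows copied from the full response shown above).
raw = """[
  {"id": "ytc_UgxFVp-HO2KMDF7GDEd4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxqE6zlUXyl5QAJx7x4AaABAg", "responsibility": "distributed",
   "reasoning": "contractualist", "policy": "regulate", "emotion": "approval"}
]"""

# Index the codings by comment ID for direct lookup.
codes = {row["id"]: row for row in json.loads(raw)}

# Look up one coding by its full comment ID (the truncated display
# IDs in the sample list would not match; the full ID is required).
code = codes["ytc_UgxqE6zlUXyl5QAJx7x4AaABAg"]
print(code["policy"], code["emotion"])  # regulate approval
```

Note that each coding carries exactly the four dimensions shown in the Coding Result table (responsibility, reasoning, policy, emotion), so a matched row maps directly onto that table.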