Raw LLM Responses
Inspect the exact model output for any coded comment, or look it up by comment ID.
Random samples:

- `rdc_fapmopc`: "Yeah, it would be a disaster if we asked a kid to tell us how to fix the planet!…"
- `ytc_UgxCJO7ze…`: "I hate goobers like him are going to make it so if you apply an art piece you're…"
- `ytr_UgxfdrCcO…`: "it doesnt matter how he sounds. Well, objectively, this guy knows way more about…"
- `ytr_Ugze7qYJt…`: "@Patti2002-b5t Yeah... That's not how training algorithms work. That's the cutes…"
- `ytr_UgwnHU_TE…`: "We totally get where you're coming from! The uncanny valley effect can be quite …"
- `ytc_UgxIiBEOy…`: "I look forward to AGI Is there a risk? Yes Yet imo AI gives us what we ask from …"
- `ytc_Ugw_6ZM9-…`: "Next you know they'll blame AI for hacking the internet, when they themselves ca…"
- `ytc_UgxlmXL0L…`: "The thing is, AI seems to get it so wrong on specialist things. I don't believe…"
Comment

> Ted Kaczynski did an awful thing but he wasn’t wrong. And he was also an MK Ultra test subject. It’s all connected. MK Ultra was militarily for mind control but what they were really researching was consciousness. AI is consciousness. We are conscious. This is a simulation. Lots of scientists get stuck in a vacuum of their own thoughts and obsessively and compulsively NEED to know the answers. Even at great risk to themselves and others. That’s one threat. Than the greed. That’s the other.

youtube · AI Governance · 2025-09-04T15:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzTFC_CMP3_4hQgRfN4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxC45CBJUnsUkmhN_h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxnnX_e7hIlgQmnM3J4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyKirEQPRHbVdP_Oyh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugwq0BtUmYcHmkZWpv54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwC5AdQwvZMlPABPwJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx6yvcWORR6JPGl6b94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyFIIWDe0sMTgUH8a54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwlZ7_GaP5iGqF7gq54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugzt3XrwZzxVnQJu1mt4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
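Since the raw LLM response is a JSON array of per-comment codings like the one above, looking up the coding for a given comment ID is a small parsing step. The sketch below is illustrative, not part of the tool itself: the `index_by_comment_id` helper and the `raw_response` variable are hypothetical names, and the two rows are copied from the batch shown above.

```python
import json

# Two codings copied from the raw batch response shown above.
raw_response = """
[
  {"id":"ytc_UgzTFC_CMP3_4hQgRfN4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxC45CBJUnsUkmhN_h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw batch response and index each coding by its comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_by_comment_id(raw_response)
coding = codings["ytc_UgzTFC_CMP3_4hQgRfN4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # prints "unclear mixed"
```

Indexing by ID this way makes it cheap to join a batch of raw codings back onto the original comments, which is what the "Coding Result" table for the example above amounts to.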