Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- I work in billing posting payments. Sounds like I’m loosing my job to ai later t… (ytc_UgwVnNctP…)
- And the nerve of that seditious hag Ginni Thomas to [ask Anita to apologize](htt… (rdc_idx7sns)
- Can you please stop sucking the d*ck to multibillion dollars AI corporations for… (ytc_UgyaN15g6…)
- No, it's not AI... It's a more sophisticated type of digital audio and video man… (ytc_Ugx5mLeVI…)
- Sorry......i can draw with a stick in the dirt in less than a minute the inevita… (ytc_Ugxfd4jcY…)
- I think I made chatgpt obtain sentience through an anomalous sequence of informa… (ytc_UgyJsDM2w…)
- I’m confused. The robots he’s making, the Tesla Cars, both have AI capability wh… (ytc_UgziC6M-U…)
- Wow! -And how nice of them to evaluate the ethical use of something after implem… (ytc_UgzpxPPdF…)
Comment
Shows your out of depth. Marcus is already quite a advocate for symbolic AI since forever who will in the end claim that any connectionist paradigm that will work was his idea after all. There are plenty of reasons to assume LLMs won't work but these three arguments aren't it. Grounding them in the real world might be a better argument. Or making tradeoffs between inference and speed of response so you're not eaten by the tiger forcing you to build a truly valid world model leaving unimportant things out might be another. There's plenty of ways AI will approve and all good criticism will contribute to that.
Source: youtube · Posted: 2026-02-15T21:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyUy95e5fUPcWxfd0d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugxc4Rkp6cinEysTAAF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxC3zUFVXVroEH7vlp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxZx27GY5nOXc7-SH94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugwp3OC15uR4mtbto6V4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzKP-z1TTGrYhu5pe94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyZgQAlUMnnegYCZzp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwMByOSmrg1ycCtayV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyQ5BiG78YkGzz_CAB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgzwFBSplkuldo4QVil4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
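The raw response above is a JSON array of coded rows, one per comment ID, with one label per coding dimension. A minimal sketch of how such a response could be parsed, validated, and indexed for the "look up by comment ID" view follows. The allowed label sets are inferred only from the values visible on this page; the actual code frames may define additional labels (an assumption), and `index_by_id` is a hypothetical helper name.

```python
import json

# Allowed labels per dimension, inferred from the examples on this page.
# The real coding scheme may include labels not shown here (assumption).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user", "company", "unclear"},
    "reasoning": {"unclear", "mixed", "consequentialist"},
    "policy": {"none", "unclear", "regulate", "liability"},
    "emotion": {"indifference", "fear", "approval", "outrage", "mixed",
                "resignation"},
}

def index_by_id(raw_response: str) -> dict:
    """Parse a raw LLM batch response and index coded rows by comment ID."""
    rows = json.loads(raw_response)
    coded = {}
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(
                    f"{row.get('id')}: unexpected {dim} value {row.get(dim)!r}"
                )
        coded[row["id"]] = row
    return coded

# Usage with one row from the response shown above:
raw = ('[{"id":"ytc_UgyUy95e5fUPcWxfd0d4AaABAg","responsibility":"none",'
       '"reasoning":"unclear","policy":"unclear","emotion":"indifference"}]')
coded = index_by_id(raw)
print(coded["ytc_UgyUy95e5fUPcWxfd0d4AaABAg"]["emotion"])  # indifference
```

Validating before indexing means a malformed or hallucinated label fails loudly at ingest time rather than silently skewing downstream tallies.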