Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Right but a wheelchair is a wheelchair, AI is like having a custom made track to…" (ytc_UgyTpKYzp…)
- "There’s no real debate left about whether self-driving trucks are, or soon will …" (ytc_UgwWYOXvu…)
- "What the fuck is an AI artist? / He just typed A bunch of words into a computer / I …" (ytc_Ugz_5X5qZ…)
- "Some generative AI are free, if it's not about quality or using own PC to genera…" (ytc_UgyXtStQe…)
- "Is it unreasonable to assume that Mark Zuckerberg saw the impact and political i…" (rdc_m5nmcox)
- "I agree with his political philosophy. It's all game theory and so this is why w…" (ytc_Ugxfc1YH9…)
- "Given what you express here, it seems your feelings about ai art are all rooted …" (ytc_UgwYSdfuD…)
- "This is a very played out narrative. I use AI frequently and I don't really see …" (ytc_UgwlFDosZ…)
Comment
As a physics PhD, I haven't seen AI even doing post 1st year college physics/maths without spewing complete nonsense. Even for writing, which is where LLMs are supposed to shine, it would constantly give nonsense and often wrongly factual statements. For me it was only useful to help find typos or structure my arguments.
As a software engineer, LLMs are good at finding solutions that everyone knows about (again going back to 1st year basic knowledge). But when it comes to good design, architecture of complex systems, things require actual reasoning, it fails to deliver.
Apple showed LLMs are incapable of reasoning, so we need to stop thinking that they can.
Source: youtube · AI Jobs · 2025-10-04T21:4… · ♥ 35
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx7ZhKF0kThfmD-jL54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwHQ7lRiQYuFuPD5FZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwmjtQ-8muETBMaZ-l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxGgksItfMFliSI1nZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwremSdYlenQWerS3t4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzrVhS48zO_9pLb1vh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgytYNN60wMnoX0FvZB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy9bUfIvkZge4qEvSJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgznXHf_7Ob3tLwgukB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugww7fv7yM6pD_D-EvN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
```
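A response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal, hypothetical validator: the four dimension names come from the response itself, but the allowed value sets only cover the values observed in this dump — the real codebook is presumably larger, so adjust `ALLOWED` to match it.

```python
import json

# Two rows copied verbatim from the raw response above.
RAW = """[
{"id":"ytc_Ugx7ZhKF0kThfmD-jL54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy9bUfIvkZge4qEvSJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"}
]"""

# Values observed in this dump only; the full codebook is assumed
# to contain more (hypothetical -- replace with the real scheme).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user"},
    "reasoning": {"unclear", "consequentialist", "virtue"},
    "policy": {"unclear", "none", "industry_self"},
    "emotion": {"indifference", "fear", "outrage", "approval",
                "mixed", "resignation"},
}

def validate(raw):
    """Parse a raw LLM response and index rows by comment ID,
    rejecting any row with an unknown dimension value."""
    coded = {}
    for row in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
        coded[row["id"]] = row
    return coded

coded = validate(RAW)
print(coded["ytc_Ugy9bUfIvkZge4qEvSJ4AaABAg"]["emotion"])  # resignation
```

Indexing by comment ID makes the "look up by comment ID" view above a single dictionary access, and failing loudly on unknown codes catches drift between the prompt's codebook and what the model actually emits.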