Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (truncated previews with their comment IDs):

- "Well, since modern women are obsessed with looking fake with fake booties and fa…" (ytc_UgzGro-H2…)
- "You know what industry will also be replaced by AI? These overpaid McKenzie and …" (ytc_UgxzO30RC…)
- "Your art from when you were a beginner what are you showing off that's like 100 …" (ytc_UgxOh9WX_…)
- "Most of the "AI designed to drop misinformation bombs on our heads" is already o…" (ytc_UgzY8StKi…)
- "Why are people asking Elon Musk about what his kids are going to do he's a effin…" (ytr_Ugy82S_Dg…)
- "I'm not the only one who treats ai like it may one day be sentient? Good.…" (ytc_UgyyTC-HT…)
- "Seriously, chatgpt writes really convincingly even when it's absolutely wrong. T…" (ytc_UgydEvA4a…)
- "The first "animation" looks so stiff and the second looks like that simply draw …" (ytc_UgwTCu_vP…)
Comment

> The key point that almost no one makes is that LLMs do an ok job at generating code for problems that *humans already solved* and there are 100s open-source examples for. But, to solve new problems, we need a completely new "AI" architecture that actually understands concepts. It'll probably happen in 10 years, but we are nowhere near that.

Source: youtube · 2025-03-15T22:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxchuzqHJO_RgB7-iN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxY5HHknInbPqO2aEZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugyk7uBJS0CjMftHpEJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzRnXJldKQedztZoG94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwpfeMy6_IEuZ7PjgN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxJjNOAEHt94xGETBZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwvVGcWKUcEe5ILRjl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugygkcj0FqSgy1r5wBh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzyZ5e8JtIrkHO102Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx3Uexmhpji0pu262Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
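The "look up by comment ID" step can be sketched as follows: parse a raw LLM response like the array above (a JSON list of coded comments) and build an ID-keyed index. This is a minimal sketch, not the tool's actual implementation; the inlined records are copied from the response above, and the variable names are hypothetical.

```python
import json

# A raw LLM response: a JSON array of coded comments, each with a comment
# ID and the four coding dimensions (two records copied from the sample above).
raw_response = """
[
  {"id": "ytc_UgxchuzqHJO_RgB7-iN4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugyk7uBJS0CjMftHpEJ4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "none", "emotion": "mixed"}
]
"""

records = json.loads(raw_response)

# Index by comment ID so any coded comment can be inspected directly.
by_id = {rec["id"]: rec for rec in records}

code = by_id["ytc_Ugyk7uBJS0CjMftHpEJ4AaABAg"]
print(code["emotion"])  # mixed
```

In practice the model output may not be valid JSON on every call, so a production version would wrap `json.loads` in error handling and flag unparseable responses for re-coding.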