Raw LLM Responses
Inspect the exact model output for any coded comment, looked up by comment ID.
Random samples (previews truncated, IDs abbreviated):

- `ytr_Ugw0td_0B…`: "Yeahhh, but you see, this hypothetical chef is a real person... He's not a tool,…"
- `ytc_UgweRmaxy…`: "In AI picture generation put the prompt \"beautiful\" woman. 99% of the results wi…"
- `ytc_UgyJhY0Q3…`: "What if ai robots learnt to code, then code their own systems to do whatever the…"
- `ytc_UgygMTV9X…`: "You think private equity won’t take all the gains from AI as well?? Let us know …"
- `ytr_Ugy3XYLgp…`: "It is the weird contradictive path AI is on. Some of the interesting stuff that …"
- `ytc_UgyVSdwf9…`: "Im so tired of ai voice ads and movie summaries... Its so boring and lazy to me…"
- `ytc_Ugx999KrO…`: "here are what the ai's said for me: chatgpt:That’s a powerful twist on the class…"
- `ytc_Ugz11Z_lX…`: "This Super AI topic reminds me Shrimpkins evolving on butt of Bender in Futurama…"
Comment
What about model collapse? Slop can't create high strictured output.
This idea of reinforcement from AI generated content can only work if data can be mathematically proven. Most real world data can't.
Even simple designs require world knowledge.
The AIpocalypse for "knowledge work" is vastly overrated.
Models will struggle to scale past very basic process.
Tasks associated with design can be automated... design itself... unlikely.
youtube · AI Jobs · 2026-02-26T04:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwGW59DevAiA5FdqK94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxhYLq_-WaikbyUo-Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz5FVa8saOozGF0XIh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwtnmMR3IebZ2jsAgB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwUCb-dkizB6hXmwsx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyenmsuFICQzXNzCWB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwtwcbttFZ0FF7u8fJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy1AycOmFPmRg9SPJh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxIJPDbCxtvGxnIIfp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyOvc_ttRXBNZoe8pl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
```
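The raw response is a JSON array of per-comment codes across four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of parsing such a response and indexing it by comment ID; the sets of allowed values below are assumptions inferred from the responses shown on this page, not an authoritative codebook:

```python
import json

# Allowed values per dimension: an ASSUMPTION inferred from the sample
# responses above, not the project's actual codebook.
DIMENSIONS = {
    "responsibility": {"none", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "liability"},
    "emotion": {"indifference", "approval", "fear", "outrage"},
}


def index_codes(raw: str) -> dict:
    """Parse a raw LLM response and index the codes by comment ID.

    Raises ValueError if a record carries a value outside the
    (assumed) allowed set for any dimension.
    """
    codes = {}
    for item in json.loads(raw):
        for dim, allowed in DIMENSIONS.items():
            if item.get(dim) not in allowed:
                raise ValueError(
                    f"{item.get('id')}: unexpected {dim} value {item.get(dim)!r}"
                )
        codes[item["id"]] = {dim: item[dim] for dim in DIMENSIONS}
    return codes


raw = (
    '[{"id":"ytc_Ugy1AycOmFPmRg9SPJh4AaABAg",'
    '"responsibility":"ai_itself","reasoning":"deontological",'
    '"policy":"liability","emotion":"fear"}]'
)
by_id = index_codes(raw)
print(by_id["ytc_Ugy1AycOmFPmRg9SPJh4AaABAg"]["emotion"])  # fear
```

Validating each record before indexing makes malformed model output fail loudly at ingest time rather than surfacing later as a silently miscoded comment.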