Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Howard Marks is a smart investor. But he doesn't know AI very well: All AI based on neural network is uninterpretable and unreliable. We can only allow them to do noncrucial things automatically. In 2016, google deepmind had announced the medical image AI that surpassed human radiologists on benchmarks. This let to Jefferey Hinton's famous statement: we shouldn't train radiologists anymore. But 10 years later, none of the medical image AIs are usable and we even have a shortage of radiologists!!!!! The potential of AI is way too overestimated!
youtube
AI Jobs
2025-12-12T07:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugxw3eTlGaZvcrt4y0h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyEGQvHqhijA1HQtU94AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzsSl12FDBo4nFL-nh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyLOh-0dt1GZQkl4ed4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy4jPkHFt3lS6lJ_6p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwZP0_leKmP_aCEbHd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyrzTyHnLPVU_NPVn54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxHQkhjyKR_hTivwAx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyeYq5r3ITcusqslql4AaABAg","responsibility":"investor","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx16cHZZZ48dGbOLY14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
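The raw response is a JSON array with one object per comment, carrying the four coding dimensions shown in the table above. A minimal sketch of how such a batch could be parsed and validated before use, assuming the value sets observed in this response (the real codebook may define more labels, and `validate_batch` is a hypothetical helper, not part of any pipeline shown here):

```python
import json

# Labels observed in the raw response above; assumed, not the full codebook.
ALLOWED = {
    "responsibility": {"none", "developer", "investor", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"approval", "outrage", "mixed", "fear", "resignation", "indifference"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes}.

    Raises ValueError if an entry is missing a dimension or uses a
    label outside the observed sets.
    """
    coded = {}
    for entry in json.loads(raw):
        cid = entry["id"]
        for dim, allowed in ALLOWED.items():
            if entry.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim}={entry.get(dim)!r}")
        coded[cid] = {dim: entry[dim] for dim in ALLOWED}
    return coded

raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"deontological","policy":"regulate","emotion":"fear"}]')
print(validate_batch(raw)["ytc_example"]["policy"])  # regulate
```

Validating each batch this way surfaces malformed or off-codebook model output at ingest time, rather than letting it silently skew downstream counts.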