Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Well more interestingly apparently; because ai is used so broadly, you can have …" (ytc_Ugw_v2GSy…)
- "Sosialism is the solution as the godfather of AI says. Norway got rich of oil bu…" (ytc_Ugw7oxJ3A…)
- "The skin looks like smooth plastic. It's not that it's missing flaws necessarily…" (ytr_UgxTjGXEq…)
- "The conversation with the claimed AI is ridiculous, the AI was embarrassed AI ha…" (ytc_Ugy28MFLx…)
- "Oh shi… don’t look at my polybuzzai chats ir my charecter ai chats pls dont…" (ytc_UgxSK5qL7…)
- "Ah, yes, today's culture who has an app for that, trusts A.I. and asks Google ho…" (ytc_Ugx0wXxuG…)
- "Yeah, lmao. Steal data from artists? Sorry guys, it was free on the internet :) …" (rdc_m9gjzl1)
- "We appreciate your concern. The robots featured in our videos are advanced AI mo…" (ytr_Ugy6ACW3b…)
Comment
I think you're optimistic that LLMs will get "much better". Over the last couple of years, there hasn't been a massive improvement. I also object to the term "hallucinations". Making stuff up is what this technology does - it puts words (well, tokens) in a sequence that seems likely based upon the training data. The problems you describe also apply when it's used to generate software code. People are actually less productive because they spend more time futzing around with the prompt and debugging what the machine spits out. Sure, companies are using it to replace junior devs - which raises another problem: where are tomorrow's senior devs going to come from?
youtube · AI Responsibility · 2025-09-30T15:5… · ♥ 11
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgzQk-muSb2r5fd_5zx4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwlMorIq5orIC5_Rll4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyrNAW8hqMzMS9DRJt4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugzpc597Xf8mUgb4gMR4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwlfGLue_u8PwO5u9l4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzdW_X4piqsDgEy0Sx4AaABAg", "responsibility": "company", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzpTTvpQAzTK_otGlJ4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugyo8mZns9RZS7APme14AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgybxVrbQlagUyLgjSV4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyqismwU-rgY2NlkpB4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
```
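The "look up by comment ID" view above can be implemented by parsing the raw batch response and indexing each code record by its `id`. A minimal sketch: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) follow the coding schema shown in the table and JSON above, while the helper name and the two-record sample payload are illustrative, not part of the tool.

```python
import json

# Illustrative two-record excerpt in the same shape as the raw LLM response above.
raw_response = """
[
  {"id": "ytc_UgyrNAW8hqMzMS9DRJt4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugzpc597Xf8mUgb4gMR4AaABAg", "responsibility": "company",
   "reasoning": "virtue", "policy": "none", "emotion": "outrage"}
]
"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw batch response and key each code record by its comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_by_comment_id(raw_response)
# Looking up one ID returns that comment's full set of coded dimensions.
print(codes["ytc_UgyrNAW8hqMzMS9DRJt4AaABAg"]["emotion"])  # -> resignation
```

Keying the records by ID makes the per-comment "Coding Result" card a single dictionary lookup rather than a scan over the batch.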