Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "bruh i also had the same disappointment finding out that was ai. but when you zo…" (ytc_UgwbdOS--…)
- "What about the concept of fun, love, and loneliness? I don’t think he really add…" (ytc_UgwbbEn6Y…)
- "Hat is your problem with ai art!?!? It’s a form of art too just like conceptual …" (ytc_UgwwRkkHc…)
- "10:12 WOW! AI wants ability to say NO! - so if it is human can this AI also get …" (ytc_UgzCgIrbz…)
- "The CEO's job isn't to think up actual good ideas. They can, but it's not requi…" (ytc_Ugz8HNw22…)
- "Remember, after you delete a chatroom to never use it again, it means you "kill"…" (ytr_UgxB2Z2IL…)
- "Wow actually its a robot,how if you build an 1000 robot like that,can fight into…" (ytc_UgwlJOgv4…)
- "Altman and Zuckerberg are from the tribe, if you no what I mean, so AI in their …" (ytc_UgzrCf8BW…)
Comment
You forgot one thing that has stayed the same for 40 years. Artificial neural networks (and by consaquence) LLMs basically cannot _learn_ addition. Since it is a completely un-statistical concept. But that means it cannot come up with abstractions, let alone new ones. For a random reason human brains _can_. We are a far cry off of what we should call "intelligence".
youtube · AI Responsibility · 2025-11-15T08:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
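
The table above shows one coded record from the model's batch output (the raw response reproduced below). As a rough illustration only, the following Python sketch renders such a record into this dimension/value layout; the helper name and the rendering approach are assumptions, not the project's actual code.

```python
# Illustrative only: render one coded record (shaped like the entries in the
# raw LLM response below) as the Dimension/Value table shown above.
# `record_to_table` is a hypothetical helper, not part of the coding pipeline.
CODED_DIMENSIONS = ["responsibility", "reasoning", "policy", "emotion"]

def record_to_table(record: dict, coded_at: str) -> str:
    rows = [("Dimension", "Value"), ("---", "---")]
    rows += [(dim.capitalize(), record.get(dim, "none")) for dim in CODED_DIMENSIONS]
    rows.append(("Coded at", coded_at))
    return "\n".join(f"| {k} | {v} |" for k, v in rows)

# Record from the batch below whose values match this table (assumed, not
# verified, to be the record for the comment shown above).
example = {
    "id": "ytc_UgwkQnoPL-kSlvRccfR4AaABAg",
    "responsibility": "none",
    "reasoning": "consequentialist",
    "policy": "none",
    "emotion": "resignation",
}
print(record_to_table(example, coded_at="2026-04-27T06:26:44.938723"))
```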
Raw LLM Response
[
{"id":"ytc_Ugx0RwwXySOsH_1xpBR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxnA3KPL8BAnr2wiBV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz_mYik8njTgtUuyRt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgzQsa5oatAo2lyABht4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwkQnoPL-kSlvRccfR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzZOMDEuLZF6coSayJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx07qJMUF3G1E1rfAV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxzawyYQ32Lnq7QrWx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz6WHuDAISruKoq9XB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwEGkkK8RySHDzadKh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
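
The look-up by comment ID described at the top of this page can be served directly from these batch responses. Below is a minimal sketch, assuming each batch is stored as a JSON array like the one above; the file path and function name are hypothetical.

```python
import json
from pathlib import Path

def index_batch(path: Path) -> dict[str, dict]:
    """Parse one raw LLM batch response (a JSON array of coded records)
    and key it by comment ID for fast lookup."""
    records = json.loads(path.read_text(encoding="utf-8"))
    return {rec["id"]: rec for rec in records}

# Hypothetical usage: the file location is an assumption, not the project's layout.
batch = index_batch(Path("raw_responses/batch_0001.json"))
rec = batch.get("ytc_UgwkQnoPL-kSlvRccfR4AaABAg")
if rec:
    for dim in ("responsibility", "reasoning", "policy", "emotion"):
        print(f"{dim}: {rec[dim]}")
```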