Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "Ai art is the representation 9f multiple immages Art is the interpretation of an…" (ytc_UgwlYAwfG…)
- "Can I just say how unbelievable it is that a few people seem to think they will …" (ytc_UgzC9FcXc…)
- "14:40 im disabled and a bit of an artist? not really sure what would classify as…" (ytc_UgysnnDQW…)
- "@SlepdepOnAnAlt cause they don’t pay for the ai, it’s free and easy, eventually…" (ytr_UgwET3D88…)
- "All fine and dandy until something major crashes taking away with customer data …" (rdc_n3mf97v)
- "yeah AI is taking our jobs.. not because it's more productive than us, just cuz …" (ytc_UgyI4ESOm…)
- "I knew the whisper chick virus would spread to A.I! Looks like we need Aerosmith…" (ytc_UgwCXtf6j…)
- "Clearly the mistake was in not using DAN. ChatGPT tells you it can't generate le…" (ytc_UgxkdUxHs…)
Comment
All these execs and CEOs just giving fluff answers.
Current state is as the MIT report says and I doubt it will change drastically. LLMs are not going to lead to AGI.
Yes people will be more productive. These LLM based tools help with researching, brainstorming, creating easily repeatable things, etc. But do the billions and billions being invested warrant this productivity gain?
With AGI you'd be able to say 'Build me a production ready youtube clone' and it would do it without a human having to double check the actual code, because that would have already happened automatically. The human would test it to confirm the end product is ready to be shipped.
AGI would be able to truly innovate like humans do.
Source: youtube
Timestamp: 2025-10-31T10:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id": "ytc_UgyTACwzEkYw7AXrnWB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzVUcpUiAEiLsuktNl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgwNvlFlu0h1IJP_SOh4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgwwdJqJ-6KcxG6mlZt4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugyx5AbWBShRC3PlDrB4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgyoAfqoTA9CXvkiNTl4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugxcc-KzWn08O3cpdZl4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgxpkZQ_g9egFJ_Bbs94AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugwrhk7GGlBOypahFjt4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzeJuSApkD3aJXvIbZ4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"}
]
```
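Because the raw LLM response is a plain JSON array of per-comment codes, looking up the coding for a single comment by its ID reduces to parsing the array and indexing it. A minimal sketch in Python, using two rows from the response above (the variable names are illustrative, not part of the tool):

```python
import json

# Two rows taken from the raw LLM response above; a real run would load
# the full array from the model output.
raw_response = """
[
  {"id": "ytc_UgyTACwzEkYw7AXrnWB4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzeJuSApkD3aJXvIbZ4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"}
]
"""

# Index the coded rows by comment ID for O(1) lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

code = codes_by_id["ytc_UgzeJuSApkD3aJXvIbZ4AaABAg"]
print(code["emotion"])  # prints "resignation"
```

This matches the coding shown in the result table above: the last row of the response carries `policy: "regulate"` and `emotion: "resignation"` for the displayed comment.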