Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "@akeemmorrison2589 that's what i mean. Someone messed up in the code. If "Hello …" (`ytr_Ugz9LVMtL…`)
- "While i agree that AI art is not property of a prompter, saying "so steals arts"…" (`ytc_Ugw80efNx…`)
- "he keeps asking why you cant programme the rules into AI.... its not programmed …" (`ytc_Ugz4FzsAM…`)
- "I love how the more he talks abot treatin' dat AI right da more he souns like a …" (`ytc_UgzQDX98V…`)
- "They'll settle into niches. Google will continue to capture the casual search ma…" (`rdc_oi0g37f`)
- "It's so exhausting that there are real questions to ask about the data set and m…" (`rdc_ohxtide`)
- "34:00 In the late 1980s I had BASIC code that implemented a back propagation neu…" (`ytc_Ugx4TEzjB…`)
- "The thing about superintelligence is, by definition it needs to be smarter than …" (`ytc_UgwqEcV4Q…`)
Comment
Truth being said, AGI will never replace humans, if you have a basic understanding of AI you can be pretty sure that AI is just the next balloon ready to pop. It s just a hype. There is nothing substantial to it. The tech layoffs is just the result of rebalanced supply and demand dynamics. Simply there are too many people attending universities and too little getting a blue collar job. AI will only replace easy to automate jobs, which for the most are entry level positions. But we won't go beyond that.
Last but not least, automation costs a lot of money, it doesn't come for free.
youtube
2025-08-23T04:5…
♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
  {"id":"ytc_UgwRBtzO650In68PkkZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgydqBdz-kbliVhyeXR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyruepogiIqV5Wx-LZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxtunJtkSMrxaZLCmp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyPRZ1O_Cn8hoHi6Nt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxnJo-VIYhxhGteJzZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx5hgartjgXVfU-AvF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxQJAWO1eAs9q9_owF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugx8bnydz3PSB59Cymd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwNjMS_1PGFX6Y-OQ94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]

Note: the raw response as captured ended the array with `)` instead of `]`, so it was not valid JSON; the corrected terminator is shown above. A strict JSON parse of the original output would fail, which may explain why every dimension in the coding-result table reads "unclear".
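A minimal sketch of how such a response could be parsed and validated before storing codes. This is not the tool's actual pipeline: `parse_coding_response` is a hypothetical helper, and the `ALLOWED` label sets are inferred only from the values visible in this one response (the real codebook may include more labels, e.g. an explicit "unclear").

```python
import json

# Allowed labels per dimension -- ASSUMED from the values seen in the
# raw response above; the real codebook may define additional labels.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "company"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "liability", "ban"},
    "emotion": {"indifference", "mixed", "outrage", "resignation",
                "approval", "fear"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: labels}.

    Raises ValueError on malformed JSON or out-of-vocabulary labels,
    so a bad response is flagged loudly instead of being stored as
    silently 'unclear'.
    """
    try:
        entries = json.loads(raw)  # fails on e.g. a stray ')' terminator
    except json.JSONDecodeError as exc:
        raise ValueError(f"malformed response: {exc}") from exc
    coded = {}
    for entry in entries:
        labels = {}
        for dim, allowed in ALLOWED.items():
            value = entry.get(dim)
            if value not in allowed:
                raise ValueError(f"{entry.get('id')}: bad {dim}={value!r}")
            labels[dim] = value
        coded[entry["id"]] = labels
    return coded

raw = ('[{"id":"ytc_Ugx5hg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"approval"}]')
print(parse_coding_response(raw))
```

A response terminated with `)` instead of `]` raises `ValueError` at the `json.loads` step, making the failure visible at ingestion time rather than surfacing later as an all-"unclear" coding result.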