Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- Counterpoint using your words at 7:57 "something that moves that fast". In other… (ytc_UgxOZpo5q…)
- When they autonomously develop a compulsion to validate the purpose of its exist… (ytc_Ugyw55so7…)
- As I watched these interdictions, one thought occurs to me: these seem more like… (ytc_Ugz4_yLiR…)
- its fun and games until the robot doesn't give the gun back and shoot the man an… (ytc_UgwIF4T4i…)
- I will never buy into anything AI. I'm sorry. I will never. My own children conv… (ytc_UgzjPgudh…)
- How does it feel knowing that every piece of AI Generated ART has no owner under… (ytr_UgzZVjZKz…)
- One thing we should be concerned about is the relationship between the growth of… (ytc_UgyNdKA1B…)
- I feel its only upto the user to decide to use the end result or not. It is a gr… (ytc_UgwxfvrKz…)
Comment
I deeply empathise with where you are going with all this, and I agree in most parts. But some things are much too oversimplified, if the model has no reliable data source to suggest something concrete, it can be trained to say no. The smallest model you randomly got on OpenRouter might not, but this won't hold the flagship models back. Hallucinations are real and LLMs can ramble, but humans in the loop, as you put it, can get the same results with far fewer humans. The human involvement will be strategic, not proportional to the amount of code needed. So, in the end, AI will take a lot of jobs, although not all of them. You have found holes in the mainstream message and that's a fact. But you haven't shown that it's going 'horribly wrong'. To be honest, I wish it were.
youtube · AI Jobs · 2026-02-24T16:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugxmld8P1setEUiV15B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx_MT8fUWaWP73_BRd4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzE-Hh3JvScxvadqrR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzGWS7YRGW-MYmvZTF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzEDNrQ4kIPrzam6UN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyzUJHbNSDFJlo8l-Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzUyD1mD3rcZ8bqYEJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxD6JUdLcrm8QFBofV4AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzphHG2u01eZwccsph4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzkPD8Fiwo9SLQ3eox4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
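The raw response is a JSON array with one record per comment ID, each carrying the four coding dimensions. A minimal validation sketch for such a batch is shown below; the allowed vocabularies are inferred from the sample output above, so the real coding scheme may define additional values (an assumption, not the tool's actual schema):

```python
import json

# Dimension vocabularies inferred from the sample response above.
# Assumption: the real coding scheme may allow more values per dimension.
ALLOWED = {
    "responsibility": {"none", "developer", "company", "ai_itself", "user"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none"},
    "emotion": {"indifference", "mixed", "approval", "fear", "outrage"},
}

def validate_coded_batch(raw: str) -> list:
    """Parse a raw LLM response and reject malformed records."""
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded comments")
    for rec in records:
        # Comment IDs in the samples are prefixed ytc_ (comment) or ytr_ (reply).
        if not str(rec.get("id", "")).startswith(("ytc_", "ytr_")):
            raise ValueError(f"bad comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

raw = '[{"id":"ytc_abc","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}]'
batch = validate_coded_batch(raw)  # returns the parsed records
```

Validating before storing means a hallucinated label or truncated array fails loudly at ingest time rather than silently skewing the coded dataset.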