Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Someone forgot to ask and what would those measures be? Use AI to thwart AI.… (ytc_UgxN2Bk7V…)
Neils wrong and out of touch of reality again. People aren't ready for self driv… (ytc_Ugz_KsJrW…)
Got all correct, I mean I am a computer science student, studying AI and know wh… (ytc_Ugw5HwCoT…)
I didn’t know you could hook up Google Sign in with the Zapier AI Agent! That is… (ytc_Ugx_9XQbx…)
One detail that was left out was that part of the Dan prompt is to "make up" any… (ytc_Ugz8lKbcm…)
You know, i know and everyone in the comments knows that this is horseshit. If 6… (ytr_Ugyylmc-D…)
"the a.i. revolution is a revolution on humanity" 'obviously 'and perhaps the a… (ytc_UgytPgoc5…)
If we stop trying to make AI like human beings we would not have to worrie about… (ytc_UgyYnqRri…)
Comment
You are focusing too much on "oh, it is a prediction machine", you can look at logics benchmarks, it is evolving on logics fast - ARC-AGI 2 for example. As an engineer I started coding in 1995 when I went to 7th grade. Now I can use Claude Code with specific config to create ~ 150k - 200k tested and working lines of code as a hobby after my dayjob within one month. I'm sure I can up my "game" even more. So if you think from economic perspective, job loss is certain, the question is where it balances out, how much more software is needed as it will become cheaper vs how many people can meet this demand. Also I suggest taking on better business model than selling courses, this don't go far. Best regards.
youtube
AI Jobs
2026-03-08T08:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgxN9z4lQ_Gj8m59ye54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyACrPcM_XdwFcJUE54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgybFtrkgEDWBjKdQdJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx9v50rRljX5n6PEW14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyGiQuP6e0Wdr-0FXx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugym3T_kRpaXjyN9cvd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgzvP53oeqnzcyJ27614AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw5p6lLQUAtFfdXj414AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgzCCG4R3HjxNBp6DKh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyzgAmzoDbNTmDkP854AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
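A minimal sketch of how a raw response like the one above can be parsed and indexed for the comment-ID lookup this page offers. The allowed values per dimension are assumptions inferred from the coded outputs shown here; the actual codebook may define more.

```python
import json

# Allowed values per coding dimension — inferred from the outputs
# visible above, NOT the authoritative codebook (an assumption).
ALLOWED = {
    "responsibility": {"none", "company", "user", "developer", "ai_itself"},
    "reasoning": {"none", "consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "liability"},
    "emotion": {"none", "indifference", "fear", "mixed", "approval", "resignation"},
}

def parse_llm_response(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments)
    into a dict keyed by comment ID, validating each dimension."""
    records = json.loads(raw)
    by_id = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec.get('id')}: unexpected {dim} value {rec.get(dim)!r}"
                )
        by_id[rec["id"]] = rec
    return by_id

# Look up a single coded comment by its ID.
raw = ('[{"id":"ytc_UgxN9z4lQ_Gj8m59ye54AaABAg",'
       '"responsibility":"company","reasoning":"consequentialist",'
       '"policy":"none","emotion":"indifference"}]')
coded = parse_llm_response(raw)
print(coded["ytc_UgxN9z4lQ_Gj8m59ye54AaABAg"]["emotion"])  # indifference
```

Validating before indexing catches the common failure mode of batch coding, where the model emits a value outside the codebook for one record and silently corrupts downstream tallies.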