Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Platooning won't happen. We are maybe 6 months away from AGI. Once that happens those trucks will be better at driving and basically anything than any human. The only jobs that will be safe at least for the next 5 years or so are jobs that would need a humanoid shaped robot to do them Plumber, Electrician, lineman ect. or jobs that people will demand human contact nurses, onsite customer contact stuff like that. And that is only because it will take a few years to start cranking out humanoid robots.
youtube · AI Jobs · 2025-06-20T13:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugy08i2xYUuLW-KWY9Z4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwWOFrpY5gpW2PgSDJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw6M-1iySU7-rQJwgh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgycyY9iy2igN6lPDHJ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyO_QX9Yz9MyTpgT0V4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugy1OpdHs34ZAh1kf914AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyfeSNYo-G8gu3tNbV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgyzzbhO9zYkDbhCWvV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugx8bWFIbzy1w1QJV8Z4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzCijwuwkViQTiT-Mp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```