Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
AI IS SOULLESS BUT DIGITAL ART IS NOT CUZ IT UR OWN ART STYLE WHILE AI IS NOTHIN…
ytc_Ugy2B3McD…
The question that haunts me: can a system that cannot stop, cannot say 'I don't …
ytc_UgzF_gWRW…
After watch Gothamchess videos, yes AI is really danger!
Imagine AI summoning m…
ytc_Ugw5XT4mv…
I had a project going on with gpt o3. It was working pretty well and it's just g…
rdc_n7oeqbd
I think that a lot of the people that are like "Nah lol, that won't happen." hav…
ytc_UgiGv6YyL…
ngl if someone blatantly steals from an ai "artist" id rather support the real a…
ytc_UgypYoHgp…
Powering a human takes way more energy per unit of productivity. AI is very effi…
ytc_UgyRkuWJ0…
As a quick example, the AI required hardware necessary to run a call center of $…
ytr_UgwecTIVW…
Comment
This is a distortion. People will still be needed, which means juniors will be needed. I work with AI when coding every day and although it keeps getting better, I always have to intervene at some point, albeit less frequently (both as I get better at how to tell them what to do, and as they improve over time), but I think that asymptote still leaves room for knowledge workers, and the real *insidious* danger is that, lacking full understanding, they will write really insidious bugs that only the smartest humans will be able to solve.
Also, AI's can't pick direction. They have no sense of value or the cost of anything. I have to constantly remind them to watch for O-notation and stop writing to disk (instead of far faster memory) because time and LOC cost money and maintenance.
youtube · Viral AI Reaction · 2025-11-23T19:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzuIV4-Q0TtxEoXKEN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyCsPD0oyBNUBd0ant4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwZrfadMbdH3l4W6nl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwsCtAO6fBuhoZrBbt4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugzrvm4yhXOJEXzrQG94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx77tHWcDTp4H-eNwJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxXVQr7PUKycwO2YBV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwzMWdjXm6YLVItTpZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_Ugy4dpOkhhcDLJd-ckl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyL9d2TfN2BMa0wVHl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
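A raw batch response like the one above can be parsed and indexed by comment ID for the lookup shown at the top of this page. The sketch below is a minimal, hypothetical example: the allowed category values are inferred only from the codings visible on this page and may be a subset of the real coding scheme, and `index_codes` is an illustrative helper name, not part of any actual pipeline.

```python
import json

# Assumed category sets, inferred from the visible samples; the real
# coding scheme may include additional values.
ALLOWED = {
    "responsibility": {"none", "company", "government"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "ban", "liability"},
    "emotion": {"fear", "indifference", "approval", "outrage"},
}

def index_codes(raw: str) -> dict:
    """Parse the model's JSON array and return {comment_id: record},
    raising on unknown category values so bad codings fail loudly."""
    by_id = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={rec[dim]!r}")
        by_id[rec["id"]] = rec
    return by_id

raw = ('[{"id":"ytc_Ugx77tHWcDTp4H-eNwJ4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]')
codes = index_codes(raw)
print(codes["ytc_Ugx77tHWcDTp4H-eNwJ4AaABAg"]["emotion"])  # approval
```

Validating against a closed vocabulary at parse time is what makes a "look up by comment ID" view trustworthy: any record the model mis-codes with an out-of-scheme label is rejected rather than silently displayed.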