Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I've been seeing almost all of these you mentioned are coming, but those AI idol…" (ytc_UgzgqAYiR…)
- "Aren't they making a new website like deviantArt that doesn't allow AI art and G…" (ytc_Ugx_oOzJ3…)
- "People who are ferociously defending AI art are people who want AI art to get be…" (ytc_Ugy3T9s6F…)
- "It's unfortunate that you can't own a style under US copyright law, hopefully we…" (ytc_Ugw7elb1D…)
- "Heavy machinery trucks like that should not be operated by AI at all its dangero…" (ytc_UgxlHPaEN…)
- "I don't think so, for anyone who's actually used AI for programming it would be …" (ytc_UgxDxa35z…)
- "When AI can burn a 7018 hanging from a harness 90ft in the air then I'll worry a…" (ytc_UgwJWt7zx…)
- "i'm talking about "disinformation" not only in terms of artistic use but also in…" (ytr_UgyN9u_99…)
Comment
At no point in history have we built an equal society where all our needs are met. Even though we could do that if we wanted to, the key idea is 'to want to'. We don't want to.
Hunger in the world is not a logistics problem. It involves logistics, but it is not a problem of logistics. It's a social problem.
The same for computer security. Computer security is not a technological problem. It involves technology but it's a social problem. We need never have had to have computer security except for the fact that there are humans in the system and humans are grade A assholes. That's the reason.
By the same token, there will not be an AI-driven world of wonder for everybody. That will not happen. We simply don't want to.
We don't build paradise for ourselves. We have never done that and we never will.
youtube · AI Jobs · 2025-10-16T19:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwB_yAjv6TEizxLdwF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzKAshimGLWuVQZSEN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugzf0WPxMncseRFBRh14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxdXFiaFJQ_rrzBVup4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwsRdExgxH9FpvaxnB4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzJ3G-qqBIwLI8eZC14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgytauU6MVodQ9_zQvJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwjCyuYyuygPhAFDsF4AaABAg","responsibility":"company","reasoning":"mixed","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzUvunKInVY_XflIGF4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwxF9S_rioIN8J96sl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
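A raw response like the one above can be parsed and indexed by comment ID before the codes are stored. The sketch below is a minimal example; the allowed vocabulary for each dimension is inferred only from the values visible in this dump (it is not the full codebook), and the sample ID `ytc_x` is hypothetical.

```python
import json

# Allowed values per dimension — inferred from the codes seen in this dump,
# not an exhaustive codebook.
ALLOWED = {
    "responsibility": {"company", "ai_itself", "none", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "liability"},
    "emotion": {"outrage", "fear", "indifference", "mixed", "approval", "resignation"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of coded comments),
    reject out-of-vocabulary codes, and index the rows by comment ID."""
    coded = {}
    for row in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
        coded[row["id"]] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Hypothetical one-row batch mirroring the "Coding Result" table above.
raw = ('[{"id":"ytc_x","responsibility":"distributed",'
       '"reasoning":"deontological","policy":"none","emotion":"resignation"}]')
coded = parse_batch(raw)
print(coded["ytc_x"]["emotion"])  # → resignation
```

Indexing by ID is what makes the "look up by comment ID" view above cheap: each coded comment is a single dictionary access rather than a scan of the raw response.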