Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Technology isn't the problem, how it's used is. Robots emerged to replace human labor. To better understand this, just imagine them working for us—that's always been the intention. You'd have oranges at home without necessarily working for them. For those more uninitiated, I'll give an example: imagine a robot could speed up the waiting time for harvesting various types of fruit to one-third the time. Also, assuming that such speeding up didn't harm human health, you'd automatically have more fruit available. In other words, fewer people would be facing poverty in that region if food were distributed fairly. Don't rail against technology; it's inevitable at this point. Just rail against how it can be misused. In fact, if I had to bet, I'd say that, among several things, technology is the one most likely to be the one to destroy us when misused.
Source: youtube · Topic: AI Jobs · Posted: 2025-10-09T13:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | virtue |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwtOYGM2-N_Dy2lmPV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzNILvS8Bc3w-RML694AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw6ahTdoUwl6t_o3Gl4AaABAg","responsibility":"government","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx1TKUYRnFa9-NC_vt4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzvJRHpsP5qySmaTLB4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgynTWnmmfjWxLpd8b54AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwIqWFefiqDVPKQmwp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyFbQ_QIGl4uZXhx794AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgzpGUkEznxBr3sfRqB4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzarUPBCgVIGviv_EJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
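The raw response above is a JSON array of per-comment records, each carrying the same four coding dimensions shown in the result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be indexed for lookup by comment ID — the field names are taken from the response above, but the `index_codings` helper is illustrative, not part of the tool:

```python
import json

# Hypothetical helper: turn a raw LLM coding response (a JSON array of
# records, each with an "id" plus coding dimensions) into a dict keyed
# by comment ID, so a single comment's coding can be looked up directly.
def index_codings(response_text: str) -> dict:
    """Map each comment ID to its coded dimensions (id field stripped)."""
    records = json.loads(response_text)
    return {
        rec["id"]: {k: v for k, v in rec.items() if k != "id"}
        for rec in records
    }

# Two records copied from the raw response above.
raw_response = """
[
  {"id": "ytc_UgwtOYGM2-N_Dy2lmPV4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw6ahTdoUwl6t_o3Gl4AaABAg", "responsibility": "government",
   "reasoning": "virtue", "policy": "liability", "emotion": "outrage"}
]
"""

codings = index_codings(raw_response)
print(codings["ytc_Ugw6ahTdoUwl6t_o3Gl4AaABAg"]["emotion"])  # → outrage
```

In practice the same index also makes it easy to tally dimension values across a batch, e.g. counting how many comments were coded `emotion: outrage`.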