Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `rdc_nfs6gps`: On Android Firefox you can install the browser extension "Sink It For Reddit" "u…
- `ytc_UgySsjHzA…`: Same stupid arguments as for guns. The only thing that stops a bad guy with a gu…
- `ytc_UgzWoKGgM…`: Oh, we'll know - algorithms can only condition us but full blown ai will be pilo…
- `ytc_Ugzh3fMOV…`: Even Chat GPT says that UBI (Universal Basic Income) won’t work. You’re absolu…
- `ytc_Ugzu86Kl5…`: Guys u can turn off AI on Pintrest if u go into privacy and data and scroll down…
- `rdc_o89w8w5`: It makes my job easier and they want me to do it, so I'm doing it. Its not my fa…
- `ytr_UgxixpkWg…`: @rockstar8573 I have seen nothing that meets human initiative. LLMs do not even …
- `ytc_Ugy3tN7y5…`: Artificial Intelligence has no motivation to take on the human species. No one …
Comment
Aircraft don't have pedestrians or heavy traffic to deal with, yet they still require a trained pilot at the controls, ready to take over at a moment's notice, to be allowed in the skies in case something goes wrong.
If DOT has a single functional brain cell, they will mandate the same for automated trucks.
(Not to mention, who is going to be held legally and financially reaponsible for the driver share of any accidents? Those driverless trucks will cost way more than they expect.)
youtube · AI Jobs · 2025-05-30T20:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwYjajf1FXZrbblRSp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx21ac7DPNFZJyYCYN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxKD0qVD5TP9B7hFz94AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx9OTapOqyQFjoO-ul4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxPdttHrq5x1DPy4rh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzK81doyFYgk01ZqBV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwkY3k3FBTUONffpSh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugwqy8x8Lw4m6groG7l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw54hXBYwo0-kUNafV4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyeFJdW3UdwOnOnTYV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
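The raw response above is a JSON array of per-comment codings, one object per comment ID. A minimal sketch of how such a batch could be parsed and indexed for lookup by comment ID — the allowed category vocabularies below are inferred only from the values visible in this response, not from the project's full codebook, so treat them as assumptions:

```python
import json

# Allowed values per dimension, inferred from the response shown above.
# The real codebook may define additional categories (assumption).
SCHEMA = {
    "responsibility": {"user", "company", "government", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"ban", "regulate", "none", "unclear"},
    "emotion": {"outrage", "resignation", "approval", "fear", "mixed"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM response and index valid codings by comment ID."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        # Reject any record whose dimensions fall outside the known vocabulary.
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
        coded[rec["id"]] = {k: v for k, v in rec.items() if k != "id"}
    return coded

# One entry from the response above, used as a worked example.
raw = ('[{"id":"ytc_Ugw54hXBYwo0-kUNafV4AaABAg","responsibility":"government",'
       '"reasoning":"deontological","policy":"regulate","emotion":"fear"}]')
coded = validate_batch(raw)
print(coded["ytc_Ugw54hXBYwo0-kUNafV4AaABAg"]["policy"])  # regulate
```

Validating against a fixed vocabulary catches the common failure mode where the model invents an off-schema label, so bad batches can be re-queued instead of silently corrupting the coded dataset.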