Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- `rdc_jrpekof` — I agree it’s annoying to talk to a standard automated call handler - but I actua…
- `ytc_UgwsVA-aj…` — Sometimes I wonder if one day the top 1% really do automate everything and AI ta…
- `ytc_UgyeM_CGQ…` — Educated idiots... If the ai is fundamentally an idiot, just cramming more inf…
- `ytc_UgwaaAvzP…` — Many AI researchers are leaving large companies not only out of fear, but becaus…
- `ytc_Ugznyj19k…` — Nobody ever seems to even mention the fact that AI and our increasingly digital …
- `ytr_UgyBOuhnR…` — I'll be honest I'm not saying DA is in the right here in any way, they should no…
- `ytc_UgwXhPEcC…` — i was trying to use all ai in the market to analyze mame c++ code for a home pro…
- `ytc_UgyiXUplF…` — When i first heard of ai (and before I learned about the whole unethical backgro…
Comment
I'd like to see a law requiring that all robots employed by a corporation be paid the same salary as a human worker would - and that all robots must be owned by a human person who actually gets the wages paid to the robot. All existing labor laws, such as minimum wage, must also apply to the robots. Additionally, each person or entity should be limited to just a few robots...in other words, a single person or company can't own hundreds of them.
This would allow robots to do the jobs, without people losing their sustenance income or big companies taking over both the means and profits of production.
Or, you know, just make it illegal for companies to have above a specified percentage of automated vs human labor.
Source: youtube · AI Jobs · 2025-10-09T00:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz8oDCp3o_ZeeUNpol4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugxzr4FTrgmbXLGKdKJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxEPiF5mzbrPho8EFF4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwQRtYqnc3iOl_q-ah4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugwt3mTFVUySCpx8sh94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxvcDXgC7-xUx_NCAt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzM392l4VwKRTDApAp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxVR3lafyXR1cpSMJN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwVQvCcpKBtZE0cx8J4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxtvpZ4Y-EC17H61HR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```