Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- “It’s far to early to tell. Gemini was bad and now it’s arguably the best. Meta A…” (`rdc_mz4e229`)
- “At least we're doing better than Kazakhstan, right? I bet our internet is better…” (`rdc_da3mbr0`)
- “How do you get chatgpt to speak rather than type back to you I must be way outta…” (`ytc_Ugz2OtVQH…`)
- “Haha, it does seem like Sophia was really engaged in the conversation! Her respo…” (`ytr_UgxSPQUB3…`)
- “Elon : I want to make an AI that doesn't lie / Elon Grok : I am Triggered that my …” (`ytc_UgxyQOj-x…`)
- “The are currently taking humans out of the robotics evolution and letting AI sol…” (`ytc_UgzZXvCh8…`)
- “I don’t understand—Citigroup fired 20,000 people in the U.S., citing all sorts o…” (`ytc_UgwzsUqVe…`)
- “The issue is we live in a post ethics society. If something is of decent enough …” (`ytc_UgzTMHsxW…`)
Comment
1. This is a modern spin on "slavery". The largest element of operational cost in most businesses is labor. Reduced labor cost --> Higher profitability. That's why cotton farmers in the U.S. South wanted to preserve slavery in the early 1800's.. That's why many U.S. companies relocated their manufacturing operations outside of the U.S. in the 1980-90's. That's why Ronnie Reagan busted unions during his administration.
2. The irony of this scenario is that so many people will be unemployed by AI and robotics they will not have money to buy the products from Musk, Bezos, Suckenberg, etc. We won't be able to keep their robots busy!
youtube
AI Jobs
2025-10-10T18:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugx8xywTt7furp1YzL14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw8e6T3BOEI4nBqj1l4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgytCDDNHvbGMF8tS-R4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzvKQRRAlWsOvXAX7p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugylvz8YHdVUEDnzPFR4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwufNRWyPiFdpaxQEl4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw7vawu7wlEVAJWfM54AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyfMA076qJbRxUg2bd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugwzkdhl5Z2Ces2lkCB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwmtxXWCSoDwsDujPN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
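The raw response above is a JSON array of coded records, one per comment. A minimal sketch of how such a response can be parsed and indexed for the "Look up by comment ID" feature is shown below; the `raw_response` string here is abbreviated to the first two records of the output above, and the variable names are illustrative, not taken from this tool's actual code.

```python
import json

# Abbreviated copy of the raw LLM response shown above (first two records).
raw_response = '''
[
  {"id": "ytc_Ugx8xywTt7furp1YzL14AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugw8e6T3BOEI4nBqj1l4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]
'''

# Parse the array; json.JSONDecodeError is raised if the model emitted
# malformed JSON, which a real pipeline would want to catch and log.
records = json.loads(raw_response)

# Index the coded records by comment ID for constant-time lookup.
by_id = {rec["id"]: rec for rec in records}

rec = by_id["ytc_Ugx8xywTt7furp1YzL14AaABAg"]
print(rec["emotion"])  # outrage
```

Indexing by ID is what lets a single coded comment's dimensions (as in the "Coding Result" table above) be pulled out of a batch response without rescanning the array.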