Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
If you build a bridge and hire inexperienced engineers, deprive them of the tools they need to design the bridge properly, and refuse to listen to the warnings and cautions from the engineers and the bridge collapses and kills a dozen people, the CEO and maybe some other executives can go to jail for criminal negligence.
If you build anything that contains software and hire inexperienced software engineers, deprive them of the tools they need to create the software properly, and refuse to listen to the warnings and cautions from the software engineers and the thing kills a dozen people, the legal system can't hold anyone criminally responsible. No matter how reckless, how absurdly negligent the company's practices are, when it comes to anything with software, there are no laws or legally enforceable regulations. None. We know this because Toyota did exactly this. Their "software engineers" didn't have a bug tracker. They didn't have version control. The automotive industry publishes 90+ practices that firmware for cars is either 'recommended' or 'suggested' to follow. In the trial, it was shown that Toyota's code followed 4 of them. 4. And at the end, the judge threw up his hands. He said he felt the company's executives were negligent, he felt they were responsible for the 12 people killed by the cars experiencing "unintended acceleration" due to a software flaw, but there was nothing in the law he could point to and say they violated it, so he had to find them Not Guilty.
That case didn't get a ton of publicity. But as a software engineer, I was following it closely. When it happened, I knew the future. The first company to debut a completely autonomous car would be the one who cut the most corners, skipped the most testing, and pushed it out onto the roads to beat their competitors to the punch. And then one of those cars will plow through some kids. People will be mad, but when the court case ends with a 'not guilty' again, as it legally has to, I expect they will completely lose their minds. Toyota was found guilty and responsible in the CIVIL case, of course, but then (and it surprised me this was even something which was possible) they offered a settlement agreement to the parties after the verdict but before the jury was permitted to determine punitive damages.
Platform: youtube
Video: AI Harm Incident
Posted: 2025-08-15T20:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzOo3s7dCdUmfK1dyR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx-5L05VrieA6kAPEZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugwb-3GNnbO7lQXyx4R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugyp_NGsp3imd6AGx7h4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxunoYizgUvT_z8LM54AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgznhymlD-kfe73rvYF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzOtM4K6jn68dDGS-N4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxYc7I0ecG-ShI4Zel4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzZZK-AyMkpLsVtKHJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwLphmQvL6BN5edN-x4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
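A batch response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal, hypothetical validator: the allowed value sets are inferred only from the codes visible on this page (the real codebook may define more categories), and `validate_batch` is not part of any actual pipeline.

```python
import json

# Allowed values per coding dimension — assumed from the examples shown above;
# extend these sets to match the full codebook.
SCHEMA = {
    "responsibility": {"company", "user", "none"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"liability", "none", "unclear"},
    "emotion": {"outrage", "approval", "indifference", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Each record must be an object with a comment ID.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        # Every dimension must be present with an allowed value.
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

# Hypothetical usage with one well-formed record:
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"deontological","policy":"liability","emotion":"outrage"}]')
print(validate_batch(raw))
```

Dropping (rather than repairing) malformed records keeps the coded dataset clean; rejected IDs can then be re-queued for a second coding pass.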