Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- ytc_UgyEV2ABg… "The problem with AI in the police is the self reinforcing loop. Anyone that is p…"
- ytc_UgzXVt0s6… "I feel like much of the general public doesn't understand that AI will have godl…"
- ytc_UgxmhZglq… "Conclusion: my ex wife is ai because she doesn't have the ability to feel remors…"
- ytc_Ugz3wvu7h… "AI could replace even your Anderson Coopers LOL, virtual newsreaders etc - not j…"
- ytc_UgwpWHhat… "Even with the AI the modern society has no problem creating bull*hit jobs and ke…"
- ytc_UgwHITm2C… "Things like that piss me off. Platforms and human artists are singling AI music …"
- ytc_Ugxd3cGj9… "WHEN AI fails, the US government will bail out all the big tech companies respon…"
- ytc_UgyR-hVBg… "I'm far from an AI polyanna, but I don't think the main point here is valid. Co…"
Comment
The liability costs for human-caused accidents are spread out. All of the liability for a fully-autonomous vehicle accrues to the manufacturer(s). Read: "deep pockets". So even if the overall accident rate is lower, the cost of litigating the inevitable failures will be untenable. The only way for this to be viable is for the ultimate failure rate to be infinitesimally small, or for government to intervene with legislation that deprives individuals of their rights to full compensation. I know a lot of people are in love with the idea of millions of fully autonomous cars on the roads but 40 years in Information Technology leads me to believe we're not as close as is generally believed.
Source: youtube · Posted: 2018-03-21T00:2… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgykkJHY2zNi0eHSo_R4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugw-wvkKZc9KIwiNtj14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugy0o7Dc_M5GWXuHKHN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwPryzJFiadJAgTPu54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxJapNITb9mBuq_T1Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxNSoZT7mB4mHEBUfJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxJnkToSlwv0ZS6Qox4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugy7WP_A0S-nsWLEsTx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzQVReE7RsUkUYUn494AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgxWbQwzYPtwc9j8kOR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
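Because the raw response is a JSON array of records keyed by comment ID, looking up a coded comment amounts to parsing the array and matching on the `id` field. Below is a minimal sketch of that lookup, assuming the raw response has been saved to disk; the file name `raw_llm_responses.json` and the helper `lookup_coded_comment` are illustrative names, not part of the tool itself.

```python
import json
from typing import Optional

def lookup_coded_comment(raw_response: str, comment_id: str) -> Optional[dict]:
    """Parse a raw LLM batch response (a JSON array of coded comments)
    and return the record whose "id" matches comment_id, or None."""
    records = json.loads(raw_response)
    for record in records:
        if record.get("id") == comment_id:
            return record
    return None

# Hypothetical path to a saved raw response like the one shown above.
with open("raw_llm_responses.json", encoding="utf-8") as f:
    raw = f.read()

result = lookup_coded_comment(raw, "ytc_Ugy7WP_A0S-nsWLEsTx4AaABAg")
if result is not None:
    print(result["responsibility"], result["reasoning"],
          result["policy"], result["emotion"])
# For the response above: company consequentialist liability fear
```

The ID passed here is the one coded in the example, so the printed values match the Coding Result table for that comment.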