Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
This is long, so if you want the TL;DR, here it is: we should just ban self-driving cars and build walkable cities and robust public transit networks instead. But if we're still going to let autonomous vehicles drive, they need to be limited, and the companies making and operating them need to be punished for every mistake their products make.

If we're going to insist on having these autonomous vehicles, we need a strict legal limit on how many of them can be in the same city at a time, every accident or crime involving them needs to be publicly acknowledged, and there needs to be a law whereby, whenever an autonomous vehicle breaks the law or is involved in an accident of any kind, in any way, the company or companies that made, own, and/or operate the vehicle each have to pay a percentage of their monthly revenue (calculated before tax), with the percentage depending on how much damage was done. This should be calculated in 10 categories, with category 1 calculated per law broken, categories 2 and 3 calculated per accident, and categories 4 through 10 calculated per casualty (and I would argue that the percentage of revenue the companies have to pay for those last 6 categories should increase exponentially with each additional casualty). The categories should be cumulative, so if a vehicle causes two categories of damage, the company must pay the percentage for both. And if the company can't or won't pay, their entire product line should be recalled. No getting out of it by dragging out litigation or settling out of court. Also, the minimum percentage should be at least 30% of monthly pre-tax revenue (none of the usual "they were fined 1% of their earnings after they killed an entire class of kindergartners" nonsense).

The categories (in ascending order of severity) should be:
1. No injuries or property damage, but laws were still broken
2. Property damage of less than $1,000
3. Property damage of $1,000 or more (calculated per incident)
4. Injury to any person or animal that does not result in hospitalization or death
5. Injury to an animal resulting in hospitalization
6. Injury to an adult resulting in hospitalization
7. Injury to a child resulting in hospitalization
8. Death of an animal
9. Death of an adult
10. Death of a child

So if, for example, a self-driving Cybertruck is speeding, and as a result it kills a 4-year-old and causes over $1,000 worth of property damage, Tesla would need to pay the fine for all three categories of damage (those being the laws that were broken, the child that was killed, and the property damage that occurred). If there's no financial reason to do so, then these companies are not going to bother making their products safer, although personally, I think they should just be banned entirely. They're unnecessary (we don't need to replace human drivers to make the roads safe; we all know the only way to truly make the roads safer is to build walkable neighborhoods and reliable public transportation, stop blaming pedestrians for being mowed down by cars, and increase restrictions on who is allowed to drive), they encourage bad city planning (we all know these companies are going to try to get human drivers and pedestrians banned as soon as they can so they can flood the streets with their stupid "smart" cars and pollute the air until it's fully opaque), and on top of all that, they're ugly (do you really want them to be the only cars you see?). All these companies are really doing is stealing jobs and wages from humans (Uber and Lyft did this too when they replaced taxis, btw), ruining cities (they won't stop until they're the only cars on the road, and then they'll push to remove speed limits so they can make it impossible to walk anywhere), and destroying our health (air pollution is not a joke, it kills 7 million people every year) to increase their own profit, and it needs to be stopped.
youtube 2026-02-20T03:2… ♥ 1
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  company
Reasoning       mixed
Policy          ban
Emotion         outrage
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgxP0aa13Ywaub4NGi54AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugxbcfb4zlUSOq9FTZB4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugx_iDv5zZgZJC_7-lF4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugy3b_aPWTjkIr23XKB4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxCkdCqavMCcgGIpbB4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyT54OguAEp7v9QYOB4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgyU1jAjs_RWClWYvEN4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwTaWME-kzfED36mJd4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugwb0cNoA9ll95Hg2KR4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugz9349-RtrUGW1Ms3t4AaABAg", "responsibility": "company", "reasoning": "mixed", "policy": "ban", "emotion": "outrage"}
]
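A response in this shape can be parsed and sanity-checked before the codes are stored. The sketch below is an assumption about how such validation might look, not the pipeline's actual code; the allowed label sets are inferred from the values that appear in this batch, and the real codebook may permit more.

```python
import json
from collections import Counter

# Label sets inferred from the batch shown above; the actual codebook
# may allow additional values.
ALLOWED = {
    "responsibility": {"none", "government", "company"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "liability", "ban"},
    "emotion": {"approval", "fear", "outrage", "indifference"},
}


def validate_batch(raw: str):
    """Parse a raw LLM response, reject records with unknown ids or
    labels, and return the records plus per-dimension tallies."""
    records = json.loads(raw)
    tallies = {dim: Counter() for dim in ALLOWED}
    for rec in records:
        if not rec.get("id", "").startswith("ytc_"):
            raise ValueError(f"unexpected id in record: {rec!r}")
        for dim, allowed in ALLOWED.items():
            value = rec[dim]
            if value not in allowed:
                raise ValueError(f"{rec['id']}: {dim}={value!r} not in codebook")
            tallies[dim][value] += 1
    return records, tallies


# Usage example with two records copied from the batch above:
raw = ('[{"id":"ytc_UgyT54OguAEp7v9QYOB4AaABAg","responsibility":"company",'
       '"reasoning":"deontological","policy":"ban","emotion":"outrage"},'
       '{"id":"ytc_Ugwb0cNoA9ll95Hg2KR4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
records, tallies = validate_batch(raw)
print(tallies["policy"]["ban"])  # 1
```

Failing fast on an out-of-codebook label is the design choice here: a model that drifts from the schema is caught at parse time rather than silently skewing the coded counts.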