Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
They are not fooling anyone. Companies like Aurora are there TO REPLACE human truck drivers. It is WOVEN into their business models. How will they earn money if they are paying the government and human drivers? The goal of automation in their minds is to REMOVE the human driver. REMOVE the statutory benefits that comes from hiring the driver and removing THE SALARIES paid to that driver. Not to mention that the truck without driver can go 22 hours a day (2 hours probably for maintenance). Labour laws limit driver hours to 8. That is ALMOST TRIPLE. AI also does not complain or protest when wages are low (or none at all). If you put drivers in a driverless trucks, they would be worse off. Because the companies will always pass the LIABILITY FROM ACCIDENTS (the ONE THING they can lose money off of) to the human drivers. Even if the damage is not their fault. I guarantee you that. What is needed is for these driverless trucks to AID delivery for an industry that does not earn much and can do a fat lot help from them even if it is NOT profitable: The agriculture industry. Driverless trucks can help them get a business advantage against giants like Target and Walmart by selling AND TRANSPORTING their goods DIRECTLY to the small Farmer's Market guys ELIMINATING THE NEED for grocer's like Walmart. THAT is where they should be. But of course we know US Congress (and Walmart) won't allow that to happen.
youtube AI Jobs 2026-01-03T03:0…
Coding Result
Dimension        Value
Responsibility   company
Reasoning        deontological
Policy           liability
Emotion          outrage
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugy697s29z09bFKvsSJ4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugz2TKmkZ1oDLw7q2y94AaABAg", "responsibility": "company", "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxujijC1BbBe-6SlYJ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgySTdXSMuccytKOtEF4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgyqjxX3Zp3tPVPjBNl4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzsROVs4RxJtKlzYPl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxHXhUU6vVm_8lsWup4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugyl8iXCjM2nbswY1zB4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugwws6kffqMYd1rfTMV4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwGrygZwMV5OubejXh4AaABAg", "responsibility": "company", "reasoning": "mixed", "policy": "regulate", "emotion": "outrage"}
]
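A minimal sketch of how a raw response like the one above could be parsed and sanity-checked before its codes are stored. The `ALLOWED` sets are inferred from the code values visible in this one batch, not from an official codebook, and only two of the ten records are reproduced here for brevity.

```python
import json

# Raw model output, as returned by the LLM (truncated to two records).
raw = '''[
  {"id": "ytc_Ugy697s29z09bFKvsSJ4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugyl8iXCjM2nbswY1zB4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]'''

# Allowed codes per dimension -- inferred from this batch, not a codebook.
ALLOWED = {
    "responsibility": {"company", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"liability", "regulate", "none"},
    "emotion": {"outrage", "resignation", "fear", "indifference", "approval"},
}

def validate(records):
    """Return (id, dimension, value) triples that fall outside ALLOWED."""
    bad = []
    for rec in records:
        for dim, ok in ALLOWED.items():
            if rec.get(dim) not in ok:
                bad.append((rec.get("id"), dim, rec.get(dim)))
    return bad

records = json.loads(raw)
print(validate(records))  # empty list: both sample records use known codes
```

Looking up a single comment's codes by id is then a one-liner, e.g. `next(r for r in records if r["id"] == some_id)`.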