Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Autonomous cars will perhaps have to go through some super major software changes. These cars may assume everybody obeys all the traffic laws. They don't. Bicyclist (in Texas anyway) routinely run red lights, stop signs, ride on the wrong side of the road, and ride in the wrong direction on one-way streets. Pedestrians J-walk as a matter of fact. If a pedestrian is J-walking and the car stops to avoid hitting the walker, and unless the car behind is autonomous too it is going to rear end the first car. The car behind figures since there's no lights and stop signs it is not looking to make abrupt stops, and although they should be more attentive, humans won't. In this case a Uber backseat passenger gets killed or injured. If the car is in autonomous mode, but the driver always has to be prepared to brake on a dime, then self driving doesn't relieve much of the anxiety of city driving. This is going to be a very difficult problem to solve when objects "come out of nowhere," and you have a mixture of humans and robots driving. In Texas you will see people straddling the lane to cross the street and are no where near a light, and sometimes with traffic on both sides! I can avoid them, I'm used to it, but actually stopping is more dangerous to the driver in the right. Dead right, maybe! Imagine someone running across the freeway, will ALL the cars stop to accommodate? I fear a lot more people are going to get killed, but alas "it's progress, " I'm going to hear.
youtube 2018-03-20T00:5… ♥ 3
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          regulate
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgzA5EWM5Q4xi25RxHN4AaABAg", "responsibility": "developer",   "reasoning": "consequentialist", "policy": "regulate", "emotion": "indifference"},
  {"id": "ytc_UgwbcVw3Uf3IR6y79md4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugx74ju0jaeTVcAitLJ4AaABAg", "responsibility": "company",     "reasoning": "deontological",    "policy": "ban",      "emotion": "outrage"},
  {"id": "ytc_Ugy76zP2XGaQEGazOvx4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "unclear",  "emotion": "mixed"},
  {"id": "ytc_UgyZd2Yh2NMkutdd_WV4AaABAg", "responsibility": "company",     "reasoning": "consequentialist", "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_Ugz3YzmNOO5JGIwwFJ54AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "ban",      "emotion": "fear"},
  {"id": "ytc_Ugw3lLvuQRZ_8tZLfnh4AaABAg", "responsibility": "distributed", "reasoning": "deontological",    "policy": "regulate", "emotion": "resignation"},
  {"id": "ytc_UgzCez9KhrftCgbp_M14AaABAg", "responsibility": "company",     "reasoning": "deontological",    "policy": "ban",      "emotion": "outrage"},
  {"id": "ytc_Ugy-6ZDWh_jkrsBTqix4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "unclear",  "emotion": "fear"},
  {"id": "ytc_UgxaSb8GSi-1-mv64R14AaABAg", "responsibility": "company",     "reasoning": "mixed",            "policy": "unclear",  "emotion": "mixed"}
]
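The raw response is a JSON array of per-comment codings, and the "Coding Result" table above is the record whose `id` matches this comment. A minimal sketch of how such a batch response could be parsed and indexed by comment id is shown below; the helper name `index_codings` and the validation logic are illustrative assumptions, not part of the pipeline shown here, and the sample string is abbreviated to two of the ten records:

```python
import json

# Abbreviated sample of the raw LLM batch response above (two of ten records).
raw_response = '''[
  {"id": "ytc_UgzA5EWM5Q4xi25RxHN4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "indifference"},
  {"id": "ytc_UgwbcVw3Uf3IR6y79md4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]'''

# The four coding dimensions shown in the result table above.
REQUIRED_DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse a raw batch response and index codings by comment id,
    dropping any record that is missing one of the coding dimensions."""
    records = json.loads(raw)
    return {
        r["id"]: {dim: r[dim] for dim in REQUIRED_DIMENSIONS}
        for r in records
        if all(dim in r for dim in REQUIRED_DIMENSIONS)
    }

codings = index_codings(raw_response)
print(codings["ytc_UgzA5EWM5Q4xi25RxHN4AaABAg"]["responsibility"])  # → developer
```

Indexing by id (rather than relying on array order) makes the lookup robust if the model returns records out of order or drops a comment from the batch.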