Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
As an AI researcher, I disagree with you on removing radar being purely a cost cutting measure. Sure that might be one of the reasons but I'm pretty sure it's not the main one. Doing sensor fusion is very complicated and error prone even with regular programming. It tends to be even worse with ML systems. It's the age old "don't change 2 things at once" problem of engineering. Lidar by itself is a good tool but Elon is right in saying we can't keep relying on them going forward. Due to shape of a car, Lidars have to be massive, unaerodynamic, power hungry beasts. Perfecting autopilot with Lidar wouldn't make sense if no one wants that chunky boy sitting on their car. Also, having perfect autopilot with Lidar's help doesn't translate well to vision based self driving. My feelings towards self driving remains unchanged. Sure, until we figure it out, it might kill thousands of people but that's a sacrifice we must make to eventually reach almost completely safe autopilot systems. It's not like humans are perfect drivers. Millions are killed each year in traffic by human drivers. It's like how industrial revolution caused deaths of millions of miners by black lung and mine collapses but it also eventually lead to doubling of life expectancy in all humanity. Sure, It'd suck if I was one of the miners who died at age 30 due to black lung but at least thousands of people got to live better and longer lives thanks to my sacrifice.
youtube AI Harm Incident 2022-09-07T09:5…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       mixed
Policy          unclear
Emotion         mixed
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_Ugzen4vT07S2KEDY2mh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_UgwCuPHyCT-CFkemolt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugz1fsgbuSzwbDBWut14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw1WkngcmA5CoE81AF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugxd-clwNyko053dtNx4AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxZ7XlxT3q-bSquW7Z4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzqOOsuHd-j_ju45Mp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyJeNPMb_ciiGerQG94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzZKOA1huf7b8bXWER4AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwDFBXC2ISH5ZhZywl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
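A minimal sketch of how a raw response like the one above might be parsed back into per-comment codings. The id used here (`ytc_Ugxd-clwNyko053dtNx4AaABAg`) is one of the ids in the response whose values match the coding-result table; the lookup helper is hypothetical, not part of the pipeline shown.

```python
import json

# Abbreviated raw LLM response: a JSON array of per-comment codings,
# with ids and values taken from the response shown above.
raw = '''[
  {"id": "ytc_Ugxd-clwNyko053dtNx4AaABAg",
   "responsibility": "company", "reasoning": "mixed",
   "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugzen4vT07S2KEDY2mh4AaABAg",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "resignation"}
]'''

# Index the codings by comment id for quick lookup.
codings = {row["id"]: row for row in json.loads(raw)}

# Retrieve the coding for the comment displayed on this page.
coding = codings["ytc_Ugxd-clwNyko053dtNx4AaABAg"]
print(coding["responsibility"], coding["reasoning"],
      coding["policy"], coding["emotion"])
# → company mixed unclear mixed
```

Keying the parsed array by `id` makes it straightforward to join the model's output back to the original comments, and any id missing from the response surfaces immediately as a `KeyError`.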