Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
How about the fact that, on average, human drivers kill someone about every 5 hours in the U.S. and Waymos have yet to cause the death of even one human being world-wide? Are you Canadians pro-death? If not, you should have an open mind to the potential of the best AV technology to, eventually, significantly reduce the rate of fatal collisions. In the U.S., traffic fatalities (caused almost entirely by human drivers) are now the #1 leading cause of death of children. Do you Canadians want more American children to die? If not, then you should understand why developing AV technology that has already proven to be significantly safer than the average human driver can, eventually, save the lives of a lot of children. BTW, speeding is well known to increase the risk of both crashes involving serious injuries and deaths. (This is one of the main reasons countries have speed limits.) Those crash rates scale roughly linearly with speed over significant speed ranges. So, if you're a speeder, then you're a relatively dangerous driver.
youtube 2025-12-09T22:4…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       consequentialist
Policy          regulate
Emotion         outrage
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytr_Ugx0yh-QYQzN9eusVYZ4AaABAg.AJg3JzIDEsdAJv4zfKc7WI","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytr_Ugz_qzu66GWgFiDUXi14AaABAg.AJg30PpeMy1AJg7wWPt_oI","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytr_Ugz_qzu66GWgFiDUXi14AaABAg.AJg30PpeMy1AJgCtk8ifNK","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytr_UgxUfU-x6q2WYkyLTGh4AaABAg.AJfzx-g3rtrAJg5NRhPa2C","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgxUfU-x6q2WYkyLTGh4AaABAg.AJfzx-g3rtrAJg8rxIxPAh","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgxUfU-x6q2WYkyLTGh4AaABAg.AJfzx-g3rtrAJgBtdGDs7p","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytr_UgwT3eNs0IwR2r_VXf54AaABAg.AJfYBjDI0rJAJfdIMPHCoE","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_Ugw1J00X-oiZGA_KsmZ4AaABAg.AJfVyJ2WoTmAJfiJ--GpeD","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgxQHA00iE5NynCZOxt4AaABAg.AQVHF2pOiH7AQXOib4HEf7","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
  {"id":"ytr_Ugx6DD5dEYKD5wZ3yM94AaABAg.AQUaUI01C9fAQXLYK17wlO","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
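A response like the one above can be checked programmatically before its values are stored as coding results. The sketch below is a minimal, hedged example of parsing such a JSON array and flagging out-of-vocabulary values; the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response shown, but the allowed-value sets are illustrative assumptions inferred from the values that appear here, not the tool's actual codebook.

```python
import json

# Illustrative vocabularies inferred from the values seen in the raw
# response above; the real codebook may allow more (or different) labels.
ALLOWED = {
    "responsibility": {"none", "developer", "company", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "ban", "regulate", "liability", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference",
                "resignation", "unclear"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and reject unknown label values."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
    return records

# Hypothetical single-record response for demonstration.
sample = ('[{"id":"ytr_example","responsibility":"none",'
          '"reasoning":"consequentialist","policy":"regulate",'
          '"emotion":"outrage"}]')
records = parse_codings(sample)
print(records[0]["policy"])  # regulate
```

Validating before insertion catches the common failure mode where the model emits a label outside the codebook (or malformed JSON), so the bad record is rejected rather than silently stored.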