Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Self-driving has always been a very irresponsible technology to push, not because the cars themselves are bad drivers (or rather, regardless of that) but because of the intersection of human and automation. A good example to look at is aviation: there the autopilot is treated like an extra crew member that must be actively monitored to ensure safe flight. Accidents have happened because of an over-reliance on the autopilot which, combined with a loss of situational awareness, has resulted in planes crashing into mountains, or in panic responses and subsequent crashes when a fault causes the AP to disconnect. Sriwijaya Air 182 and Adam Air are great examples of this. As a result pilots are trained thoroughly to monitor the AP and ensure it is flying safely, and as it stands commercial aviation is very safe. Do I trust the public to have this standard? To monitor what their car is doing? Noooope. There are clearly holes in Tesla's 'Swiss Cheese' that will continue to result in the death of people. You don't want to drive your car? Get a bus instead. If Tesla actually wanted to improve their cars and their automated system, they should take a positive growth attitude rather than try to deflect every criticism. If this was a plane company instead, then whatever problems Boeing has recently faced would be a drop in a lake compared to the s***show at Tesla.
youtube AI Harm Incident 2025-08-21T05:4…
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       mixed
Policy          regulate
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[{"id":"ytc_UgyTLKSafrn4zPpZ-Dl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgxuQD8imFwWDmsoCc54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
 {"id":"ytc_Ugy6Qt8MoKo04nb6SUR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_UgzkRnF-VVM1r7IYV214AaABAg","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"outrage"},
 {"id":"ytc_Ugyfq8fJT8LwzZRUJ7F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
 {"id":"ytc_UgyUVM_ZksaPMWSoiYV4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
 {"id":"ytc_Ugym3XSbgjlEFoovVuZ4AaABAg","responsibility":"company","reasoning":"mixed","policy":"liability","emotion":"mixed"},
 {"id":"ytc_Ugzze9fmnuzv-ctekxh4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"approval"},
 {"id":"ytc_UgwP3GYKOK3kJ1gL9Lx4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_UgzA_HfhxAJ5zJpDJTR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}]
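The raw response is a JSON array with one object per coded comment, each carrying the same four dimensions shown in the Coding Result table. A minimal Python sketch of how such a response could be parsed and a single comment's coding looked up by id (the field names and the example id come from the JSON above; everything else is illustrative):

```python
import json

# Abbreviated copy of the raw LLM response above: each element codes one
# comment across four dimensions (responsibility, reasoning, policy, emotion).
raw = (
    '[{"id":"ytc_UgyUVM_ZksaPMWSoiYV4AaABAg",'
    '"responsibility":"distributed","reasoning":"mixed",'
    '"policy":"regulate","emotion":"fear"}]'
)

codes = json.loads(raw)

# Index codings by comment id so one comment's result can be retrieved directly.
by_id = {entry["id"]: entry for entry in codes}

coding = by_id["ytc_UgyUVM_ZksaPMWSoiYV4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # distributed fear
```

This coding matches the table for the comment shown (responsibility: distributed, reasoning: mixed, policy: regulate, emotion: fear).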