Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
As a motorcyclist, this is super concerning given the growing popularity of this technology in these vehicles specifically. However, it's important to note the United States regulations and distinctions on what is and isn't full autonomy. I realize other countries may have different regulations, but since these accidents happened in America, I think that distinction is important. The U.S. government recognizes several levels of autonomous vehicles in use. To field a fully autonomous vehicle that requires no driver, most manufacturers use lidar technology. (The largest I know of that operates a beta ride-share service is Waymo; they have logged an impressive 3 million miles across their fleet with a total of six accidents, I believe, and in nearly all cases it was another, non-autonomous vehicle striking theirs.) Tesla does not use this technology and isn't classified as a fully driverless system by the U.S. Their labeling it Autopilot makes sense in some respect, as it is a hands-off, zero-input-from-the-driver feature. But as with a lot of new technology, the legislation and regulation will never be perfect, nor will the implementation of the technology unless a wild amount of redundancy is added. It is very unfortunate that these accidents occurred, but on the whole I think it is the fault of the Tesla drivers, given the current U.S. regulations.
YouTube AI Harm Incident 2022-09-04T12:4…
Coding Result
Dimension       Value
Responsibility  government
Reasoning       deontological
Policy          regulate
Emotion         fear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgxnQ9jGYwm3E4D1e3F4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzdyq7FCgl-kjhMkAJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyXRFnNj1lnkiDGiDt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyyJ7VJqi8_ljnN6dV4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwORKHDZUkPsTXsPDZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugz9iSIleL_wGmyCKeF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyQE_XoixBkBd1JC2R4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw0_wutpzvDw5lJB794AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyJd9itNriWyRrMVq54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxiokvASqBZ-s40QBB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"fear"}
]
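The "inspect the exact model output for any coded comment" step can be sketched in a few lines of Python: the raw LLM response is a JSON array in which each element codes one comment id across the four dimensions shown above. This is a minimal sketch, not the tool's actual implementation; the `lookup_coding` helper and the shortened sample payload are illustrative.

```python
import json

# Illustrative excerpt of a raw LLM response: a JSON array where each
# element carries a comment id plus the four coding dimensions.
raw_response = """[
  {"id": "ytc_UgyyJ7VJqi8_ljnN6dV4AaABAg",
   "responsibility": "government", "reasoning": "deontological",
   "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxnQ9jGYwm3E4D1e3F4AaABAg",
   "responsibility": "user", "reasoning": "deontological",
   "policy": "none", "emotion": "indifference"}
]"""

def lookup_coding(raw: str, comment_id: str):
    """Return the coding dict for one comment id, or None if absent."""
    # Parse the array once, then scan for the matching id.
    return next((c for c in json.loads(raw) if c["id"] == comment_id), None)

coding = lookup_coding(raw_response, "ytc_UgyyJ7VJqi8_ljnN6dV4AaABAg")
print(coding["policy"])   # -> regulate
print(coding["emotion"])  # -> fear
```

Matching the coded dimensions in the table against the raw response this way is a quick sanity check that the parsed coding and the model's verbatim output agree.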