Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
there are several ways to avoid accidents like this 1.-make trailers run on diferent roads where cars can not enter 2.-if the first one is not posibble (which I find it pretty easy to accomplish) make every car be no closer than 3-4 seconds apart, so if something like this happens there is chance to move to one lane, to the other or simply to slow down, and since the rule of the 3-4 seconds is still working the car on the back will stop as well as the one in the back of that one and so forth and so on of course there will still be accidents, if something can go wrong IT WILL GO WRONG, its a law called "The fuck you law" or more commonly know as "Murphy's Law", many of these accidents can be avoided if we make research all of the car accidents that have taken place since 1950 to today and come up with rules that will save millions of lives, and with with upcoming accidents the self driving car program can be upgraded. Now, before anyone gets angry at me, I know this is just a thought experiment, as stated in the video, "reality may not play out like our thought experiments"
youtube AI Harm Incident 2015-12-12T06:0…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       consequentialist
Policy          regulate
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgjY5ZbRHpZbl3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugg6vkyHWXADQngCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgiE3qm0bdtqengCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UggVGd5tRkaKZHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UggsLujeKwbCNngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_Ughj07npbLjXPngCoAEC","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UggP4ePx319A6ngCoAEC","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugj-9FzhtV_B43gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghfJlHACEBRgHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Uggx47tC_oo6mXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]
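A raw batch response like the one above can be checked against the per-comment coding result by indexing it on the `id` field. The sketch below uses only the standard `json` module; the record shown is copied from the response above, and looking up by id is an assumed workflow, not a documented pipeline step:

```python
import json

# One record copied verbatim from the raw LLM batch response above.
raw = '[{"id":"ytc_UggsLujeKwbCNngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"}]'

# Index the batch by comment id so individual coding results are easy to inspect.
codes = {row["id"]: row for row in json.loads(raw)}

row = codes["ytc_UggsLujeKwbCNngCoAEC"]
print(row["policy"], row["emotion"])  # regulate indifference
```

This mirrors the Coding Result table, which lists policy `regulate` and emotion `indifference` for this comment.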