Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Pretty sure this is going to happen regardless of various fears.  Given enough time, people will find ways to make the existing ones more efficient or work from scratch - if not the companies doing it now, then another company or even a community of hobbiests and/or unemployed (and potentially bored) engineers, programmers, etc. And I don't think it's necessarily a bad thing, provided manual override exists and takes priority over automated systems.  Some things could also be left completely mechanical (such as an emergency break, or some mechanism to shut off the vehichle).  In instances where manual controls were being used, I'd think it would be much the same as it is right now as far as who is at fault/responsible/etc. Speaking of that, essentially the same questions could be asked right now of manually controlled vehicles.  If someone jumps out in front of a car and gets hit, if they jump out and the driver swerves and damages property or hits another vehichle or injures someone else, etc.  Another set of questions could be asked regarding if part of the vehichle malfunctions at the right time to cause an accident - is that the driver's fault if they didn't happen to have it checked recently?  The manufacturer?  The person who programmed or designed anything that should have provided a warning?
youtube AI Harm Incident 2014-05-26T11:4…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         approval
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[{"id":"ytc_UggClE0QGTufbHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_UghVv_MI-gLHDHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}, {"id":"ytc_UgjsxqmpUtf7YXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_Ugg28UmkRD2tpngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}, {"id":"ytc_Ugie-lUnu0GZ63gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_UggrRnRLKHDtQngCoAEC","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"indifference"}, {"id":"ytc_Uggs_LmUfAdFeXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}, {"id":"ytc_UgjxPGfWudcBB3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytc_UghiDZg6vHR7nngCoAEC","responsibility":"user","reasoning":"contractualist","policy":"liability","emotion":"mixed"}, {"id":"ytc_UgjmBLzPv7AehXgCoAEC","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"}]