Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
There is no ethical dilemma. All you do is require the AI developer to be insured instead of the people who own the vehicle. This creates a single buyer insurance option organically and overall lowers the cost and ultimate impact of any foreseeable harm or damage the system might cause. And it coincides with the ultimate goal of any company, profit. In fact their potential profit is supported entirely by their ability to protect the people using their services from any and all damage and harm. Win/Win In a world where idiots are not modern versions of royalty, this is a non-issue. But because stupid people need directions with crayons to find the bathroom, these are problems. Why create imaginary problems when there are real problems that need your attention?
YouTube · AI Harm Incident · 2015-12-08T19:2…
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        contractualist
Policy           liability
Emotion          approval
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UghSiRcVXA-3FHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugg36gd_wQOCXHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UggzSEiGsQNLKngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghidMHZsCybB3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgjzNTXzuzIxOngCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_UghfmsovrnUJPXgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgjQy7gtc5pA_XgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugio_pXgICTxCXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UggKQCpjXBYZKXgCoAEC","responsibility":"developer","reasoning":"contractualist","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgggitcG_CbrUXgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"}
]
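The raw response above is a JSON array with one object per comment, keyed by comment `id` across the four coding dimensions plus emotion. A minimal sketch of how such a response could be parsed and looked up against the coding-result table (assuming the response is valid JSON in exactly this schema; the truncated single-entry string here is illustrative, not the full batch):

```python
import json

# Illustrative raw LLM response: a JSON array of per-comment codes,
# trimmed to the one entry that matches the coding result shown above.
raw = (
    '[{"id":"ytc_UggKQCpjXBYZKXgCoAEC",'
    '"responsibility":"developer","reasoning":"contractualist",'
    '"policy":"liability","emotion":"approval"}]'
)

codes = json.loads(raw)

# Index the codes by comment id so a given comment's coding result
# can be checked against the model output directly.
by_id = {entry["id"]: entry for entry in codes}

code = by_id["ytc_UggKQCpjXBYZKXgCoAEC"]
print(code["responsibility"], code["policy"])  # developer liability
```

In practice, `json.loads` would be wrapped in a `try`/`except json.JSONDecodeError` so a malformed model response is flagged rather than silently dropped.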