Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It's a bad system that would allow itself to be boxed in in the first place. If there are no contingencies, create contingencies. This is especially true once all cars are automated and are all just part of a larger system. All should be able to compensate for any other under any circumstance.
Source: YouTube · AI Harm Incident · 2015-12-08T16:5… · ♥ 140
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  developer
Reasoning       consequentialist
Policy          regulate
Emotion         approval
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UggmIyJ8SloWNngCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UghTisOhXvg2MXgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UggJ7uf4xwzHrHgCoAEC", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ughd4nDqmE0otngCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UghTs3eIZEp4CXgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgiT1_uxg4Qf93gCoAEC", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgiiYSCGtUOQQ3gCoAEC", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ughs2ea7-kE5XHgCoAEC", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugj_gIAyUkWWl3gCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UggO5i8Su4Fd-HgCoAEC", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
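The raw response is a JSON array with one record per coded comment, keyed by the comment id and the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be parsed and tallied — the field names come from the response above, but the `tally` helper and the two-record sample are illustrative, not part of the tool:

```python
import json
from collections import Counter

# Illustrative two-record sample in the same shape as the raw response above.
raw = '''[
  {"id": "ytc_UgiiYSCGtUOQQ3gCoAEC", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgiT1_uxg4Qf93gCoAEC", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]'''

def tally(raw_json: str, dimension: str) -> Counter:
    """Count how often each value appears for one coding dimension."""
    records = json.loads(raw_json)
    return Counter(record[dimension] for record in records)

print(tally(raw, "responsibility"))
print(tally(raw, "emotion"))
```

Joining each record back to its source comment is then a lookup on the `id` field against the comment store.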