Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Tesla FSD is like putting airbags in a Yokosuka MXY-7, eventually, maybe once every five years the AI will get confused, not see something like a dangerous load on a semi or police chase on the opposite side of the highway and there will be a serious crash. What happened to good simple engineering, failsafe mechanical systems, basic driver controls, respecting human intelligence and imagination and all the research in auto, aerospace and building design into safety and fire safety. Worlds gone mad, the future is electric, maybe Donut Labs, but not this insanity and appalling engineering.
youtube AI Harm Incident 2026-01-10T17:3…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          regulate
Emotion         outrage
Coded at        2026-04-26T19:39:26.816318
Raw LLM Response
[
  {"id": "ytc_Ugy8heufm61nZCoEWd94AaABAg", "responsibility": "company",    "reasoning": "consequentialist", "policy": "regulate",  "emotion": "outrage"},
  {"id": "ytc_UgzxO_TZYjFwankaU6Z4AaABAg", "responsibility": "company",    "reasoning": "deontological",    "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugxf0ITWdMYDEvUzD-d4AaABAg", "responsibility": "government", "reasoning": "virtue",           "policy": "regulate",  "emotion": "outrage"},
  {"id": "ytc_UgyuAkuDGuf71E7BRcl4AaABAg", "responsibility": "company",    "reasoning": "consequentialist", "policy": "none",      "emotion": "approval"},
  {"id": "ytc_UgwJ-ClA9-zesBejv6F4AaABAg", "responsibility": "none",       "reasoning": "unclear",          "policy": "none",      "emotion": "indifference"}
]
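A raw response like the one above can be checked by parsing it and indexing the records by comment id. The sketch below is an illustration, not the tool's actual code; it assumes the response is valid JSON with the five fields shown (`id`, `responsibility`, `reasoning`, `policy`, `emotion`).

```python
import json

# Raw LLM response copied from the record above (one JSON array,
# one object per coded comment).
raw = """[
  {"id":"ytc_Ugy8heufm61nZCoEWd94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzxO_TZYjFwankaU6Z4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugxf0ITWdMYDEvUzD-d4AaABAg","responsibility":"government","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyuAkuDGuf71E7BRcl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwJ-ClA9-zesBejv6F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]"""

records = json.loads(raw)

# Index by comment id so one comment's coding can be looked up directly.
by_id = {r["id"]: r for r in records}

# Sanity check: every record carries all five expected dimensions.
expected = {"id", "responsibility", "reasoning", "policy", "emotion"}
assert all(set(r) == expected for r in records)

# The coding shown in the table above for the quoted comment:
print(by_id["ytc_Ugy8heufm61nZCoEWd94AaABAg"]["emotion"])  # outrage
```

The assertion makes malformed responses (missing or extra fields) fail loudly before any record is stored.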