Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
How is it that people want to have self-driving in their cars? As if driving isn't stressful enough; the fact that automated driving is imperfect means that you have to be even more alert and aware of what is going on with the car. At any moment it could misunderstand what it is seeing and cause an accident. For the driver/supervisor of the car, this level of concentration is extremely difficult to maintain if you are not actually driving; you would quickly get used to the car acting correctly most of the time, so you probably wouldn't have any idea what was happening as you hit a car or ran over a person in the road. The fact that we can make a car self-driving doesn't mean we should. You wouldn't let a trained dog drive a car, so why an under-engineered system of computers?
youtube AI Harm Incident 2025-11-02T16:1…
Coding Result
Dimension       Value
Responsibility  user
Reasoning       consequentialist
Policy          unclear
Emotion         fear
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_Ugymu1zzylg7qOYqAHB4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzQV_VwR4rIpSdCsZl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxiaMd6-rOd6jHswaZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxVLL2Pi06iMjveVcp4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz4gPv0UE-BKIhR9Y54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy4Y9rvleyHQ7OIgCF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwqRiiJn_sCH_93b6N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxzB3iAjjONeLLjdYp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugxni84J59B7EkkKo8t4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyhX_sKbSdkKTGyM-V4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
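The raw response is a JSON array of per-comment records, each carrying the four coding dimensions keyed by comment id. A minimal Python sketch of how such a response could be parsed and sanity-checked is shown below; the allowed-value sets are assumptions inferred only from the values visible in this response (the full codebook may define more), and `parse_coding_response` is a hypothetical helper, not part of any pipeline shown here.

```python
import json

# Allowed values per dimension, inferred from this raw response alone
# (assumption: the real codebook may permit additional labels).
ALLOWED = {
    "responsibility": {"company", "developer", "user", "none", "unclear"},
    "reasoning": {"virtue", "consequentialist", "deontological", "unclear"},
    "policy": {"none", "liability", "unclear"},
    "emotion": {"outrage", "fear", "mixed", "approval"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: {dimension: value}}."""
    coded = {}
    for rec in json.loads(raw):
        codes = {k: v for k, v in rec.items() if k != "id"}
        # Reject values outside the expected label sets.
        for dim, value in codes.items():
            if value not in ALLOWED.get(dim, set()):
                raise ValueError(f"unexpected {dim}={value!r} for {rec['id']}")
        coded[rec["id"]] = codes
    return coded

# Two records taken verbatim from the raw response above.
raw = '''[
 {"id":"ytc_UgxiaMd6-rOd6jHswaZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_Ugymu1zzylg7qOYqAHB4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"}
]'''

coded = parse_coding_response(raw)
print(coded["ytc_UgxiaMd6-rOd6jHswaZ4AaABAg"])
# {'responsibility': 'user', 'reasoning': 'consequentialist', 'policy': 'unclear', 'emotion': 'fear'}
```

Keying by id makes it straightforward to cross-check a single comment's coded result (like the Dimension/Value table above) against the batch response it came from.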