Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@masterplayz5220 I think you're missing the point. I'm simply saying you're putting the decision of whether you live or die in the computer's control. So the computer decides, without your consent, control, or will, that it would save the pedestrian and put you head-on with an oncoming car. Now in this instance the cars weren't going fast enough to kill the driver, but if you were going fast enough in the same scenario you would have been dead. Now I don't know about you, but if my choice was to go head-on with an oncoming car because someone jumped out in front of me, I'm going to try to avoid the person, but not if it means I have to kill myself and my family in the car with me. But with this self-driving car, now you and your family die. Now do you still think this is a great idea?
Source: YouTube · AI Harm Incident · 2025-10-23T01:0…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  ai_itself
Reasoning       deontological
Policy          unclear
Emotion         fear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytr_UgwL5iVASFZ-jvE6_fp4AaABAg.9o64GCwQ_gp9o67CfIYzVg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgzvpEMLjxFBCpv3F2N4AaABAg.9o5dkz_G8QW9o5ouXaiUW2", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_Ugw8oU5uC8r-pv7CE2Z4AaABAg.AUfiKy4bDKCAUzXUeU1XKL", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_Ugx_sNwG2c9X-0sOmY14AaABAg.AQdeXAB_vozAS2yNqKYnuL", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgxaDQwGDmYZPvXnGDF4AaABAg.APltwIoNyF3AQTi3SHFK7G", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytr_UgygUlMPCGO7GGQ7L-F4AaABAg.ANPOrxMEBvoAUKk2g4qiRr", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytr_UgxpoWIBjXDiyoSsNht4AaABAg.ANLDRPCwGq6AOb-gvVk-nT", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "unclear", "emotion": "fear"},
  {"id": "ytr_Ugy14gdg9yBJ7UrnVmZ4AaABAg.ANJpUzuTD1iAQxgkY8KXkb", "responsibility": "none", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytr_UgxJ0GdPYWRQkNbcpZZ4AaABAg.AK8JXU4XXSkAKYktXnYlAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgwoIXEQUnvksXxjjjx4AaABAg.AIXBJaUBt6fAJw-VuPoFix", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
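The raw response above is a JSON array of per-comment codings keyed by comment id, one object per comment, with the four coding dimensions as fields. A minimal sketch in Python of how such a payload might be parsed and validated before use; the field names come from the output above, while the `parse_codings` helper and its strictness are assumptions, not part of the coding pipeline itself:

```python
import json

# Truncated to two entries from the response above, for brevity.
raw = (
    '[{"id":"ytr_UgxpoWIBjXDiyoSsNht4AaABAg.ANLDRPCwGq6AOb-gvVk-nT",'
    '"responsibility":"ai_itself","reasoning":"deontological",'
    '"policy":"unclear","emotion":"fear"},'
    '{"id":"ytr_UgwL5iVASFZ-jvE6_fp4AaABAg.9o64GCwQ_gp9o67CfIYzVg",'
    '"responsibility":"none","reasoning":"unclear",'
    '"policy":"none","emotion":"approval"}]'
)

# Dimensions every coding entry must carry (taken from the response format).
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(payload):
    """Parse a raw LLM coding response and index entries by comment id.

    Raises ValueError if any entry lacks one of the required dimensions,
    so malformed model output fails loudly instead of silently.
    """
    by_id = {}
    for entry in json.loads(payload):
        missing = REQUIRED_KEYS - entry.keys()
        if missing:
            raise ValueError(f"entry {entry.get('id')!r} missing {missing}")
        by_id[entry["id"]] = entry
    return by_id

codings = parse_codings(raw)
coded = codings["ytr_UgxpoWIBjXDiyoSsNht4AaABAg.ANLDRPCwGq6AOb-gvVk-nT"]
print(coded["emotion"])  # fear
```

Indexing by id makes it easy to join a coding back to its source comment, as the "Coding Result" table above does for the fear-coded comment.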