Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
1st to reach a level 5 FSD, all the cars need to be FSD, where you can program a car to avoid drive on the wrong direction, move away when police, ambulance and such are in a hurry, but we are far from that, no device will ever made a decision that's no on he's data base, the question is, what a human will do in that situation? are humans coding the Tesla software? what kind of human is? safe customers life 1st? or evaluate who will suffer more damage? the scenario start bad by someone driving in the wrong direction.....I will say better airbags and better shock absorber. even if Tesla or any other AI try to slow down, there is enough evidence that in case of frontal collision the slow vehicle is the one that get the most damage. Just as a driver must watch the road when driving, a pedestrian has a responsibility to watch where he or she is going. As a pedestrian, you should always look at traffic signs and signals, be aware of your surroundings, watch for cars, and pay attention to where you are walking.
youtube AI Harm Incident 2022-05-21T08:1…
Coding Result
Dimension        Value
Responsibility   distributed
Reasoning        consequentialist
Policy           unclear
Emotion          mixed
Coded at         2026-04-27T06:26:44.938723
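
Each coding result is a flat mapping from dimension to a categorical value. The Python sketch below shows one way such a record could be represented for downstream analysis; the CodingResult dataclass and its field names are illustrative, not part of the original coding pipeline, and the value vocabularies in the comments are only those visible in this section.

from dataclasses import dataclass

@dataclass
class CodingResult:
    # One coded comment across the four annotation dimensions shown above.
    responsibility: str   # e.g. "user", "ai_itself", "distributed", "none", "unclear"
    reasoning: str        # e.g. "consequentialist", "deontological", "unclear"
    policy: str           # e.g. "none", "regulate", "liability", "industry_self", "unclear"
    emotion: str          # e.g. "indifference", "approval", "outrage", "fear", "mixed", "resignation"
    coded_at: str         # timestamp recording when the coding was produced

# The coding result displayed above, as a record:
example = CodingResult(
    responsibility="distributed",
    reasoning="consequentialist",
    policy="unclear",
    emotion="mixed",
    coded_at="2026-04-27T06:26:44.938723",
)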
Raw LLM Response
[ {"id":"ytr_UgxiV0KC4dS7B21VOSl4AaABAg.AH_Ae4pB-DAAHo9DMRrRBN","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytr_UgwwjFYM0K6zzmUco6N4AaABAg.AGmkTsEAxD-AGnB0-xEqpw","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"approval"}, {"id":"ytr_UgxoxCizldEQvmiKohF4AaABAg.9bJ0ZDM6ZF99bJni729qsM","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}, {"id":"ytr_UgxyedjuVtALw1486yd4AaABAg.9bIgswh4dtb9bKiY0-HuEm","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"}, {"id":"ytr_Ugzx32_wxeoG6zs8uyR4AaABAg.9bHSZT-36Jx9bHXo4OgLuM","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}, {"id":"ytr_Ugw-Yx2NXtXzQ9EpS354AaABAg.9bHNEvd7R8R9bHZhoc9vq0","responsibility":"unclear","reasoning":"unclear","policy":"industry_self","emotion":"mixed"}, {"id":"ytr_Ugw-Yx2NXtXzQ9EpS354AaABAg.9bHNEvd7R8R9bJ88SI--sb","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"approval"}, {"id":"ytr_Ugw-Yx2NXtXzQ9EpS354AaABAg.9bHNEvd7R8R9bJBqCRyD5I","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}, {"id":"ytr_UgxZyHdydMQc5fnRDk94AaABAg.9bHLf4liR9_9bHnwb7mM3C","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}, {"id":"ytr_UgzQJpD_S_DAJT4NNaN4AaABAg.8e-C-Jz0IsN8e4LpCpeS9J","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"} ]