Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Why would a company choose to set the same state for two radically different situations: when FSD is deactivated by human intervention and when FSD disengages automatically? It does not make any sense to me. Look at this example or any other similar crash; Tesla would love to simply show that disengagement occurred because of human intervention alone.
YouTube 2025-06-06T01:4…
Coding Result
Dimension        Value
Responsibility   company
Reasoning        deontological
Policy           liability
Emotion          outrage
Coded at         2026-04-26T23:09:12.988011
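
The per-comment record can be pictured as a small, flat structure. The Python sketch below is illustrative only: the class and field names are assumptions, and the category values in the comments are simply the ones visible on this page, not the full codebook.

from dataclasses import dataclass

@dataclass
class CodingResult:
    # One coded comment; example values are taken from this page.
    comment_id: str       # e.g. "ytc_Ugxwk3qd3qYjAUBJsV54AaABAg"
    responsibility: str   # e.g. "company", "user", "none"
    reasoning: str        # e.g. "deontological", "consequentialist", "virtue", "mixed", "unclear"
    policy: str           # e.g. "liability", "regulate", "industry_self", "unclear"
    emotion: str          # e.g. "outrage", "fear", "approval", "indifference"
    coded_at: str         # ISO 8601 timestamp

example = CodingResult(
    comment_id="ytc_Ugxwk3qd3qYjAUBJsV54AaABAg",
    responsibility="company",
    reasoning="deontological",
    policy="liability",
    emotion="outrage",
    coded_at="2026-04-26T23:09:12.988011",
)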
Raw LLM Response
[{"id":"ytc_UgzsM780CqQLmpRoU7p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}, {"id":"ytc_UgzQp13q6if54rP5-a94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}, {"id":"ytc_UgxZ3YCfm1AZNIs92Wh4AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"fear"}, {"id":"ytc_Ugwa3dcOME_BsioaxSZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}, {"id":"ytc_UgxTJab6fN31mOYW03p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}, {"id":"ytc_Ugz_joCEi-bCuZslGSB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"}, {"id":"ytc_Ugw9I5R9nTtoKy1pLrF4AaABAg","responsibility":"none","reasoning":"deontological","policy":"industry_self","emotion":"indifference"}, {"id":"ytc_Ugxwk3qd3qYjAUBJsV54AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}, {"id":"ytc_UgyqToj03HOVG9b0j6x4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"}, {"id":"ytc_UgxIMr1WjbVKKkh4Xnp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}]