Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
4:32 Regarding partial automation, the experience of aviation, where partial automation is very frequently used to enhance safety, is worth looking at in detail. In general, partial automation such as what Tesla offers can improve safety if a number of important prerequisites are met:
1. The one in command of the vehicle remains in command.
2. The one in command of the vehicle is aware of exactly what the automation does, how it works, and what its limits are.
3. The one in command of the vehicle has been adequately prepared for safe operation of the automation, via training, study of manuals, etc.
4. The one in command of the vehicle carefully remains in the loop and is actively involved in making decisions on an ongoing basis as conditions change.
I have seen absolutely no effort by Tesla to create any of these conditions, and few Tesla drivers taking their responsibilities in the matter seriously. Personally I think that partial automation features like this ought to require at least a two hour training session from the dealer, a comprehensive manual which must be studied, and given the lack of a safe driving culture comparable to aviation, a knowledge test of that manual before the automation features can be enabled.
youtube AI Harm Incident 2025-08-16T03:3… ♥ 3
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgyvDXci1p2-Yrs5sAZ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw6DgxLtRJq9dfoCDV4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxkQDAHqeIxk8oWRyp4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzHucMf6sLeBaLQKJZ4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzxQbMwt0z11poNgWl4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgwVCmDUny4Rk6bO2kt4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugy8dMfUOkvEfAI_-XB4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwJZeZBKRxn0Kt1Hq54AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzIkdQoKrCP_3M08SF4AaABAg", "responsibility": "company", "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzvOo4cPOyqlToSfgF4AaABAg", "responsibility": "company", "reasoning": "mixed", "policy": "none", "emotion": "outrage"}
]
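A minimal sketch (not part of the original coding tool; function and variable names are illustrative assumptions) of how a raw LLM response like the one above can be parsed into per-comment coding records. The dimension names and allowed values are taken from the codes that actually appear in this response:

```python
import json

# Allowed values per coding dimension, as observed in the raw response above.
ALLOWED = {
    "responsibility": {"none", "company", "user", "distributed"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "liability", "ban", "regulate"},
    "emotion": {"indifference", "outrage", "resignation", "fear"},
}

def parse_coding(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array) into {comment_id: codes}."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        codes = {dim: rec[dim] for dim in ALLOWED}
        # Keep only records whose values fall inside the observed code sets.
        if all(codes[dim] in ALLOWED[dim] for dim in ALLOWED):
            coded[rec["id"]] = codes
    return coded

# Example: the first record from the raw response above.
raw = ('[{"id":"ytc_UgyvDXci1p2-Yrs5sAZ4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
print(parse_coding(raw))
```

Validating against a fixed code set catches the common failure mode where the model invents a value outside the codebook; such records are silently dropped here, but a real pipeline might log them for manual review.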