Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
In aviation the pilots don’t blindly trust the automation. An autopilot system requires constant monitoring and timely intervention.
youtube AI Harm Incident 2024-12-28T16:5…
Coding Result
Dimension       Value
Responsibility  user
Reasoning       deontological
Policy          none
Emotion         fear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytr_Ugyl8WgcmrnYrDrDFfx4AaABAg.ACih3BHCB6AACjDLEeg8rd", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgyjtmbwKUNNqbGY0Wt4AaABAg.ACasAebQ7a8ACkY7T4E0pR", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytr_Ugy7h8iBV5iyn86TRgZ4AaABAg.ACaRybAd8JlACavCorPhEb", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_Ugy7h8iBV5iyn86TRgZ4AaABAg.ACaRybAd8JlACb237DeRs4", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgyihgR7CKfx53ckUVJ4AaABAg.AC_GD9zoECZACnu30GAmEM", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgxUh0QQ43e_mpgGs4F4AaABAg.ACZ_4RSiocfACaoa1Qo0RO", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_Ugz0SxIJnJuj2f9oaRN4AaABAg.ACZZAWVnqY7ACanLyQqrzt", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytr_UgyKv_Ok2QL8pL7vjhF4AaABAg.ACXDApgu0n7ACXFMuLJSMu", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytr_UgzFhjEQr-9XrKBZuzF4AaABAg.ACWsL0tuNfXACX3zUQ-AQm", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_Ugx3gPQYCCgSNaef_1F4AaABAg.ACWqK-gv9bYACX78LPsYIz", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"}
]
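A raw response like the one above can be parsed into per-comment codings and checked against the four dimensions shown in the Coding Result table. The sketch below is a minimal illustration, not the project's actual pipeline: the `SCHEMA` value sets are an assumed codebook inferred from the values visible in this output, and the truncated two-record `raw` string is for brevity only.

```python
import json

# Raw model output, truncated to two of the records shown above for brevity.
raw = '''
[
  {"id": "ytr_Ugz0SxIJnJuj2f9oaRN4AaABAg.ACZZAWVnqY7ACanLyQqrzt",
   "responsibility": "user", "reasoning": "deontological",
   "policy": "none", "emotion": "fear"},
  {"id": "ytr_UgyKv_Ok2QL8pL7vjhF4AaABAg.ACXDApgu0n7ACXFMuLJSMu",
   "responsibility": "company", "reasoning": "deontological",
   "policy": "regulate", "emotion": "mixed"}
]
'''

# Assumed codebook: allowed values per dimension, inferred from the output
# visible here; adjust to match the real coding scheme.
SCHEMA = {
    "responsibility": {"none", "user", "company"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "approval", "outrage", "fear", "mixed"},
}

def parse_codings(text):
    """Parse a raw LLM response into {comment_id: coding}, rejecting
    any record whose dimension value falls outside the codebook."""
    out = {}
    for rec in json.loads(text):
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec.get(dim)!r}")
        out[rec["id"]] = {dim: rec[dim] for dim in SCHEMA}
    return out

codings = parse_codings(raw)
# The first record matches the Coding Result table above.
print(codings["ytr_Ugz0SxIJnJuj2f9oaRN4AaABAg.ACZZAWVnqY7ACanLyQqrzt"]["emotion"])  # fear
```

Validating against a fixed codebook at parse time catches the common failure mode where the model invents an off-schema label, so bad records fail loudly instead of silently entering the analysis.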