Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
doesn't the word auto pilot insinuate that it, well, automatically pilots? so the human has no need to pilot it
youtube AI Harm Incident 2025-08-15T19:2…
Coding Result
Dimension       Value
Responsibility  unclear
Reasoning       unclear
Policy          unclear
Emotion         unclear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[{"id":"ytc_UgwuRx9UpPhP587tdo14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
 {"id":"ytc_UgzSjj9Tp60Cr89I_tZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
 {"id":"ytc_Ugy495fkc9ChMossIzB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
 {"id":"ytc_Ugyk1L8QXTLa0ZHUmjh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
 {"id":"ytc_Ugw6Jio8EXR8fpft5eR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_UgwRgotJplF-O_rekRx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgxwOlyGLFQVbRUGKN54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
 {"id":"ytc_UgwY0Ozrgn3a-9CpDex4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugx9UWRQflY_Lol5RJp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_UgwKsTeKHXeqYl2q-kd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"resignation"})
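Note that the raw response above closes the array with ")" rather than "]", so it is not valid JSON; that likely explains why the Coding Result table shows "unclear" for every dimension. A minimal sketch of how such a response could be parsed defensively, falling back to "unclear" when the JSON is malformed (the function name and the sample id "ytc_abc" are hypothetical, not part of the original pipeline):

```python
import json

# The four coding dimensions shown in the Coding Result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")


def code_from_raw_response(raw: str, comment_id: str) -> dict:
    """Extract the coded dimensions for one comment from a raw LLM response.

    If the response is not valid JSON (e.g. the array closes with ')'
    instead of ']'), every dimension falls back to 'unclear'.
    """
    try:
        records = json.loads(raw)
    except json.JSONDecodeError:
        # Malformed output: code all dimensions as 'unclear'.
        return {dim: "unclear" for dim in DIMENSIONS}
    for record in records:
        if record.get("id") == comment_id:
            return {dim: record.get(dim, "unclear") for dim in DIMENSIONS}
    # Comment not present in the response at all.
    return {dim: "unclear" for dim in DIMENSIONS}


# Hypothetical example: a response with the same malformed terminator.
raw = ('[{"id":"ytc_abc","responsibility":"company",'
       '"reasoning":"deontological","policy":"regulate",'
       '"emotion":"approval"})')
print(code_from_raw_response(raw, "ytc_abc"))
# {'responsibility': 'unclear', 'reasoning': 'unclear', 'policy': 'unclear', 'emotion': 'unclear'}
```

One design choice worth noting: failing closed to "unclear" (rather than raising) keeps a single bad model response from halting a batch coding run, at the cost of silently downgrading every comment covered by that response.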