Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
pisses me off how this dude refuses to acknowledge context in any of the answers. chatgpt was clearly saying it wouldnt kill the human who didnt intentionally tie themsrlf to the track, and gemini flat out said it would save the human over the 5 robots, just because it got a little confused by the question doesnt mean you turn off your brain and ignore all context
youtube 2025-11-03T01:1…
Coding Result
Dimension       Value
Responsibility  user
Reasoning       deontological
Policy          none
Emotion         outrage
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgyvZ9lWU4w2_AESwdF4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological",   "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugzl4RBKpiBrj1B2VNd4AaABAg", "responsibility": "developer", "reasoning": "virtue",          "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzoqKoj8ukJbTa8m014AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugx6ZuaH77QOhs2dsuh4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzCVYtCKzC0MEvvaat4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological",   "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugyg-EvzQUiQEqp7uiR4AaABAg", "responsibility": "user",      "reasoning": "deontological",   "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugy_MM5hKbQK3wALJjR4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear",         "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwjZqxE02-YI8wmGzh4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological",   "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugyvs984ArOMy1Jdi8Z4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxdtRsqr8gqoBgpJUR4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear",         "policy": "none", "emotion": "fear"}
]
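The raw response is a JSON array of per-comment codings keyed by `id`. A minimal sketch of how such a batch response could be parsed and the coding for one comment looked up (the variable names are illustrative, not from any specific pipeline; the sample below uses one entry from the array above):

```python
import json

# Raw LLM output: a JSON array of coding objects, one per comment id.
# (One entry from the batch above, shown as an example payload.)
raw_response = """[
  {"id": "ytc_Ugyg-EvzQUiQEqp7uiR4AaABAg",
   "responsibility": "user",
   "reasoning": "deontological",
   "policy": "none",
   "emotion": "outrage"}
]"""

# Index the codings by comment id for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Fetch the coding for a specific comment.
coding = codings["ytc_Ugyg-EvzQUiQEqp7uiR4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # → user outrage
```

In practice one would also validate that every requested comment id appears exactly once in the response and that each dimension's value falls in the allowed code set, since LLM batch output can drop or duplicate entries.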