Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
"That is another problem", while speaking of human emotions; FIRST mistake folks. "Hallucinations", that is another problem. Therefore you take into account defective processing of AI and discard emotional responses in humans. That sounds a little like the moment Carter sent operatives to Iran, without night vision googles, because they were not invented or implemented yet. LOL. Old mistake, new circumstances.
youtube AI Responsibility 2025-11-12T15:0…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        mixed
Policy           unclear
Emotion          mixed
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgzH2XJuLgr8bvKoiBt4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzaVfo2GHDxJ1K8J4p4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugw2RM_J1Wwr5EchNQp4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzK9WCQtfBY5jN_vn54AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugw3-gXKUfZro6KZODZ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxeBfO1dPRM4XUD7YB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgybOEm29bZc7TnkH3F4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwYXuDW5ctlBGuwtkF4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugx6fl5Msy2gPfTA4Gd4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzeRIWx8wT40XW_Kmd4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
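The raw response is a JSON array in which each record carries one comment id plus the four coding dimensions. As a minimal sketch (assuming the standard-library `json` module; the shortened `raw` string and lookup id below are illustrative, taken from the response above), a record can be parsed and matched back to its comment like this:

```python
import json

# Shortened stand-in for the raw LLM response: a JSON array of coding records.
raw = '''[
  {"id": "ytc_UgzaVfo2GHDxJ1K8J4p4AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgybOEm29bZc7TnkH3F4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]'''

records = json.loads(raw)

# Index the records by comment id so a specific comment's coding can be looked up.
by_id = {r["id"]: r for r in records}

coded = by_id["ytc_UgzaVfo2GHDxJ1K8J4p4AaABAg"]
print(coded["responsibility"])  # ai_itself
print(coded["emotion"])         # mixed
```

The record retrieved here is the one rendered in the Coding Result table above.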