Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Question: Won't AI take over from fools like Altman and Elison even if it's not sentient?? (and isn't that worse - it won't have empathy).
youtube 2025-10-29T17:4…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          unclear
Emotion         fear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_Ugxc3wK7LEsfDQep1zl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw3Sd7BhiEtPMiGfCx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzriWh-yyMmrpCoZ1l4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz1aGxiMmH_i_Rcdmx4AaABAg","responsibility":"society","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxF9VUMHD3BkKLi5K94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyE4wLnQe3iRgC-BBh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyfY3F3QqauLtKqE6p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw8RuTWEgpCauMAuk14AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyBZEAtx0K9lfG4a2Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyS19d5C-cgDWQnLgN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
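The raw response is a JSON array of per-comment codings keyed by `id`, so the coding for any displayed comment can be recovered by parsing the array and indexing on that id. A minimal sketch (the variable names and the two-entry excerpt of the array are illustrative, not part of the tool's actual code):

```python
import json

# Excerpt of a raw LLM response: one coding object per comment id.
raw = '''[
  {"id": "ytc_Ugxc3wK7LEsfDQep1zl4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyfY3F3QqauLtKqE6p4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]'''

# Parse the batch and build an id -> coding lookup.
entries = json.loads(raw)
by_id = {entry["id"]: entry for entry in entries}

# Fetch the coding for the comment shown above.
coding = by_id["ytc_UgyfY3F3QqauLtKqE6p4AaABAg"]
print(coding["responsibility"], coding["policy"], coding["emotion"])
```

Looking the coding up by id rather than by position keeps the result correct even if the model returns the batch in a different order than it was sent.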