Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
At some point, AI will work out that the human programming it isn't as smart as it is and decides to ignore or remove that part from its programming, then what???
youtube · Cross-Cultural · 2025-11-05T13:2…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        consequentialist
Policy           unclear
Emotion          fear
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgyBr92Fgm3LOXZg8Bh4AaABAg", "responsibility": "ai_itself",   "reasoning": "deontological",    "policy": "ban",      "emotion": "outrage"},
  {"id": "ytc_Ugy4RwIPkIWyM5JtBd14AaABAg", "responsibility": "none",        "reasoning": "unclear",          "policy": "none",     "emotion": "fear"},
  {"id": "ytc_UgzlxdsZ6lLhzLpnLO54AaABAg", "responsibility": "none",        "reasoning": "mixed",            "policy": "none",     "emotion": "outrage"},
  {"id": "ytc_UgyPvrEmnFCuZjNfV2d4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none",     "emotion": "resignation"},
  {"id": "ytc_UgzJrZ3L2TTnHuPxMfN4AaABAg", "responsibility": "none",        "reasoning": "virtue",           "policy": "none",     "emotion": "mixed"},
  {"id": "ytc_UgxiWiJjo3YpzDV6p6J4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "unclear",  "emotion": "fear"},
  {"id": "ytc_Ugw1i-thkC_tdov0PtN4AaABAg", "responsibility": "company",     "reasoning": "unclear",          "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_UgyMsiuGz1HR-yCG-DB4AaABAg", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",     "emotion": "approval"},
  {"id": "ytc_UgyrMSP0soGi6NbJX9Z4AaABAg", "responsibility": "company",     "reasoning": "deontological",    "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxXxpOM1hVEYFmL4Pd4AaABAg", "responsibility": "none",        "reasoning": "unclear",          "policy": "none",     "emotion": "mixed"}
]
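The raw response is a JSON array of per-comment codings keyed by comment id, so matching a displayed comment back to its coding is a single parse-and-lookup. A minimal sketch of that step, assuming only the record shape shown above (the function name `lookup_coding` is hypothetical; `raw_response` embeds two of the ten records verbatim):

```python
import json

# Excerpt of the raw batch response above (two of the ten records),
# reproduced verbatim from the model output.
raw_response = """[
  {"id": "ytc_UgxiWiJjo3YpzDV6p6J4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyrMSP0soGi6NbJX9Z4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]"""

# The four coding dimensions shown in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def lookup_coding(raw: str, comment_id: str) -> dict:
    """Parse the batch JSON and return the coded dimensions for one comment."""
    records = json.loads(raw)
    by_id = {r["id"]: r for r in records}
    return {dim: by_id[comment_id][dim] for dim in DIMENSIONS}

coding = lookup_coding(raw_response, "ytc_UgxiWiJjo3YpzDV6p6J4AaABAg")
print(coding)
# {'responsibility': 'ai_itself', 'reasoning': 'consequentialist', 'policy': 'unclear', 'emotion': 'fear'}
```

The lookup for the comment shown on this page yields exactly the values in the Coding Result table, which is how the table and the raw output can be cross-checked.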