Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The other day ChatGPT gave me an anagram (I asked for help writing a crossword clue) and it got it wrong - missed a letter out. “That’s not an anagram” I said. “Oh yes, sorry, you were right to call me out” said ChatGPT. It can’t even get tiny things right which are obviously either right or wrong. The conversations you are having with ChatGPT are completely pointless because ChatGPT is rubbish.
youtube 2025-12-21T11:2…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        consequentialist
Policy           none
Emotion          outrage
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgxEt99FwfdAKQshDlp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzRilNcVOrllGHMaq94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz5qQhK05O0vRVGDOV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzAEiFJ5KgMfEChUkd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzGpdoApo3DyPCnQ-l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzCubtwE6tjsmXg86V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx9bGvYouVKLmMgykt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzgzvIvBEPPDxInb894AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"outrage"},
  {"id":"ytc_Ugx6cGCVWj8iN6dtg594AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyXROI55oEhjBwRI6F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
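A minimal sketch of how a batch response in this shape might be parsed and matched back to an individual comment. This assumes only the array-of-records format shown above (ids and dimension values are taken from the response; the lookup id here is one record from that array):

```python
import json

# The raw LLM response is a JSON array, one coding record per comment.
# Two records from the array above are reproduced here for brevity.
raw = """[
  {"id":"ytc_UgzAEiFJ5KgMfEChUkd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugx9bGvYouVKLmMgykt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]"""

records = json.loads(raw)

# Index the records by comment id so each comment's coding
# can be looked up directly.
by_id = {rec["id"]: rec for rec in records}

rec = by_id["ytc_Ugx9bGvYouVKLmMgykt4AaABAg"]
for dimension in ("responsibility", "reasoning", "policy", "emotion"):
    print(f"{dimension}: {rec[dimension]}")
```

Indexing by `id` rather than relying on array order guards against the model returning records in a different order than the comments were submitted.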