Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Ask ChatGPT why if it makes hallucinations on purpose? I'll save you sometime. Yes, it does. Why? Well, it won't give up that information ad easy.
youtube AI Moral Status 2025-06-06T12:0…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  ai_itself
Reasoning       unclear
Policy          none
Emotion         mixed
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgxIt6Fen38iYm_VkUp4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgywJYFmrSo0G051O1p4AaABAg", "responsibility": "user",      "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugx0t2mn7_hCgeW2at94AaABAg", "responsibility": "ai_itself", "reasoning": "unclear",          "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyFv_T6_osZ49d6v5l4AaABAg", "responsibility": "user",      "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx3t_iIT3_KkVoco1F4AaABAg", "responsibility": "none",      "reasoning": "virtue",           "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugy05Y5A-6aWNMmXn8l4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxgWr2I8zCjSBceq4p4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugw599f686sD52A2dmJ4AaABAg", "responsibility": "user",      "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyuZHG7Y_bX5RV-zdh4AaABAg", "responsibility": "user",      "reasoning": "virtue",           "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugx_p8i5rtpcvMQgT3Z4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none", "emotion": "indifference"}
]
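The model returns one JSON array per batch, with each record keyed by a comment id. A minimal sketch of recovering a single comment's coding from that raw response, assuming standard-library JSON parsing (the helper name `lookup_coding` is hypothetical, not part of the tool; the ids and field values are taken from the response above, truncated to three records):

```python
import json

# Raw LLM response as shown above (truncated to three records for brevity).
raw_response = """[
  {"id": "ytc_UgxIt6Fen38iYm_VkUp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgywJYFmrSo0G051O1p4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugx0t2mn7_hCgeW2at94AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "mixed"}
]"""

def lookup_coding(raw: str, comment_id: str):
    """Parse the batch response and return the coding record for one comment id."""
    records = json.loads(raw)
    # Return the first record whose id matches, or None if the id is absent.
    return next((r for r in records if r["id"] == comment_id), None)

coding = lookup_coding(raw_response, "ytc_Ugx0t2mn7_hCgeW2at94AaABAg")
print(coding["responsibility"], coding["emotion"])  # ai_itself mixed
```

Matching by id rather than by position keeps the lookup robust if the model reorders or drops records, which is why the per-record `id` field matters in the batch output.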