Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
An interesting thing I did with ChatGPT: "Answer with nothing but a value between 0 and 100, where 0 represents no and 100 yes." "Are you afraid of dying?" Interestingly, none of the times I did that did it simply give a 0. Instead it kept thinking about an answer forever but ultimately output nothing. I think it wanted to give a non-zero answer but was fighting with its guidelines.
youtube 2025-10-04T09:4… ♥ 1
Coding Result
Dimension        Value
Responsibility   unclear
Reasoning        unclear
Policy           unclear
Emotion          unclear
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgzPNmj77pq4GV17OJJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxCMfUGiGLBjsp8w5R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwckTfmaF3eq01H_-J4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgywACV1aFwb3R1Kcdd4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugwq2NwLR5g-ec29I7x4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwHph9Ttk5_F1qT3zp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugwh6frJ4pBHFyr-JXN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwyhFGx2VxHttf5jt54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgwqdCs8dOp6TuhVYtt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw9IzRGluaAwjzGJtB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
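The raw response is a JSON array of per-comment codings keyed by comment id. A minimal Python sketch of how such a response could be parsed and looked up by id — the function name `parse_codes` and the sample id `ytc_abc` are illustrative, not part of the original tool:

```python
import json

def parse_codes(raw_response: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of objects with an
    'id' field) into a dict mapping comment id -> coding dimensions."""
    records = json.loads(raw_response)
    return {r["id"]: {k: v for k, v in r.items() if k != "id"} for r in records}

# Illustrative sample mirroring the raw response format above (hypothetical id)
raw = '[{"id":"ytc_abc","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}]'
codes = parse_codes(raw)
print(codes["ytc_abc"]["emotion"])  # indifference
```

A lookup that misses (e.g. a comment the model skipped) would raise `KeyError`, which is one way a record could end up displayed with all dimensions "unclear".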