Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The scary thing is that the ai didn’t follow the orders of the last command, but at the same time knew that it was technically right. That kind of process couldn’t have been taught, but it learned it on its own.
youtube AI Moral Status 2023-12-25T06:0… ♥ 839
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       unclear
Policy          unclear
Emotion         fear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[{"id":"ytc_Ugzxl_4JUmUQLfb-LMd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgxPhbDQCOjPoAEK_It4AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"fear"},
 {"id":"ytc_Ugypkbb0H82IknwM4KV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
 {"id":"ytc_Ugw-V8I6vv5CkCDtlYl4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
 {"id":"ytc_Ugy2aTzn0irTn3eK6fV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgxuNU22kQlf5xCKFMR4AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgxVLsl-h5R1rToaZ4t4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgzjsKwN4xHNDrgQnWx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgxMQyPXeQFHr1PkBc94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgytNQCBQl4do-7vyzt4AaABAg","responsibility":"user","reasoning":"unclear","policy":"liability","emotion":"outrage"}]
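The raw response above is a JSON array of per-comment coding records. A minimal sketch of how such a response could be parsed and validated, assuming the schema visible in the output (`id`, `responsibility`, `reasoning`, `policy`, `emotion`, all strings, with `"unclear"` marking an undetermined dimension); the function name and validation logic here are illustrative, not the pipeline's actual code:

```python
import json

# Keys every coding record is expected to carry (assumed from the raw output).
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response (JSON array) into a dict keyed by comment id."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            # Fail loudly on malformed records rather than silently dropping them.
            raise ValueError(f"record {rec.get('id')} missing keys: {missing}")
        coded[rec["id"]] = {k: rec[k] for k in REQUIRED_KEYS if k != "id"}
    return coded

# One record from the raw response above, used as a small example input.
raw = ('[{"id":"ytc_Ugzxl_4JUmUQLfb-LMd4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"unclear","policy":"unclear","emotion":"fear"}]')
coded = parse_coding_response(raw)
print(coded["ytc_Ugzxl_4JUmUQLfb-LMd4AaABAg"]["emotion"])  # prints fear
```

Keying the result by comment id makes it straightforward to join the coded dimensions back onto the original comment metadata (timestamp, like count) shown above.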