Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Maybe this will stop war, or at least it's not humans being slaughtered in in a war. That has absolutely no justification for killing at all, but they're smarter than that. They will make us don't make humans go to war. There will be no government like can you imagine Trump? He can't get along with a human much less a robot or an alien. So you think about that for a second and there won't be no government there'll be no reason to. And to be honest I'd rather have an president than what we got right now
youtube AI Governance 2025-12-25T13:0…
Coding Result
Dimension      | Value
-------------- | -----
Responsibility | ai_itself
Reasoning      | consequentialist
Policy         | none
Emotion        | approval
Coded at       | 2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgxhPPKZY2KTp88jVPN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugy8sfFtHfTKqK-gO4h4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugzs1TP6JhqHwmMW4gN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzNobJa9Q2YPODre954AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxKSwSo7KesN6UouIJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzYzV7CHoJr0Vq34YR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy10c9rFGpEIWyZn994AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyJZoZKzRL647WxmZB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyQkMsvgOPuFd9HUR14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyxwntR-cQyxP8esnR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
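A raw batch response like the one above can be checked programmatically before its values are stored. The sketch below parses the JSON array and flags any record whose dimension values fall outside the codebook. Note the allowed-value sets in `CODEBOOK` are assumptions inferred from the values visible in this dump, not an official codebook, and `validate_batch` is a hypothetical helper name.

```python
import json

# Allowed values per coding dimension. These sets are assumptions
# inferred from the output shown in this dump, not an official codebook.
CODEBOOK = {
    "responsibility": {"developer", "company", "government", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"ban", "regulate", "none", "unclear"},
    "emotion": {"approval", "outrage", "fear", "resignation"},
}

def validate_batch(raw_response: str) -> list:
    """Parse a raw LLM batch response and report out-of-codebook values."""
    errors = []
    for record in json.loads(raw_response):
        for dim, allowed in CODEBOOK.items():
            value = record.get(dim)
            if value not in allowed:
                errors.append(f"{record.get('id')}: {dim}={value!r}")
    return errors

raw = ('[{"id":"ytc_UgxhPPKZY2KTp88jVPN4AaABAg",'
       '"responsibility":"developer","reasoning":"deontological",'
       '"policy":"none","emotion":"outrage"}]')
print(validate_batch(raw))  # -> [] when every value is in the codebook
```

Running this over the full batch above would return an empty list, since every displayed value appears in the inferred sets; any LLM output drifting from the codebook would show up as an `id: dimension=value` line.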