Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
We could build it like google gemini, because Gemini will just want to kill itself every time it fails to meet the criteria of a prompt
Source: YouTube · AI Moral Status · 2026-01-12T10:0… · ♥ 2
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ugw_ONl_6CKRXLh0z9F4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzpZcLhBBzuTAAeeBd4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyNPJS6EVGeU-MwReh4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugw059Oqxurx2b6BIVt4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzcsCucxIZ8EKXCrJl4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwFk-8Z7_jzooQKOGR4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwvBiuNKHkRNifDd2N4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugxwu9qOuUNJuB5n5_J4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzT1rMDzzUL_jADOmZ4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugy7c5m40wc034ma5p14AaABAg", "responsibility": "developer", "reasoning": "unclear", "policy": "none", "emotion": "outrage"}
]
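To verify that the coding-result table above really comes from the raw batch response, one can parse the JSON array and look up the entry for the displayed comment's id. A minimal sketch, assuming the response follows exactly the schema shown (a list of objects with `id`, `responsibility`, `reasoning`, `policy`, and `emotion` keys); the string here is truncated to two entries from the response above for brevity:

```python
import json

# Raw LLM response, abbreviated to two of the entries shown above.
raw = (
    '[{"id":"ytc_Ugw059Oqxurx2b6BIVt4AaABAg","responsibility":"developer",'
    '"reasoning":"consequentialist","policy":"none","emotion":"indifference"},'
    '{"id":"ytc_UgwvBiuNKHkRNifDd2N4AaABAg","responsibility":"none",'
    '"reasoning":"deontological","policy":"ban","emotion":"fear"}]'
)

# Index the batch by comment id so one comment's coding can be looked up directly.
codings = {entry["id"]: entry for entry in json.loads(raw)}

# The entry for the comment displayed above should match its coding-result table.
coding = codings["ytc_Ugw059Oqxurx2b6BIVt4AaABAg"]
print(coding["responsibility"], coding["reasoning"])  # developer consequentialist
```

This kind of spot-check catches mismatches between what the tool displays per comment and what the model actually returned for the batch.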