Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I'm going to guess this isn't "AI" as such, but a machine learning model like a neural network. No hallucinations, just plenty of good old-fashioned errors, like false positives.
youtube 2026-02-10T20:4… ♥ 1
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       unclear
Policy          unclear
Emotion         indifference
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytr_UgxV1wiSeLORV3C3LB14AaABAg.ASwd_-Sdv7vASwzDJhGA5z", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgxQiLcMxFPpx1sQYKd4AaABAg.ASuIqc54QEqASyCfOlDVh8", "responsibility": "company", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytr_UgxvOe6qFA1qdeRMLTd4AaABAg.ASuAwKGt_fYASuMc4mahN0", "responsibility": "unclear", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgxPpZKPJZaLJM9-jCR4AaABAg.ASszAf75jSIASxAqIH6ZVn", "responsibility": "user", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytr_Ugy1NKziSq8c9D5jIrl4AaABAg.AT3R0QmgGAlAT3iygvGm72", "responsibility": "developer", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytr_UgxmlsQAeRWPEpbI65V4AaABAg.AT3OIGWRhb-AT9KBYF7Gvd", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytr_UgzHSbHneU6xYOh-wbd4AaABAg.AT3LOvu38I4AT3UjYeMzUy", "responsibility": "unclear", "reasoning": "unclear", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytr_Ugy5Zpl2milKvtCqdk14AaABAg.AT3IJi7IkaQAT3Ttjv2Y1H", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "ban", "emotion": "mixed"},
  {"id": "ytr_Ugw83CSHuR28PMDLvTB4AaABAg.AT3Gdovxs_6AT3L1vRkvAG", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytr_Ugw83CSHuR28PMDLvTB4AaABAg.AT3Gdovxs_6AT3LC01DDPL", "responsibility": "developer", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]
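To inspect the raw response programmatically, the batch can be parsed as a JSON array and indexed by comment id. A minimal sketch, assuming only the record shape shown above (the id `ytr_example` here is a placeholder, not a real comment id):

```python
import json

# Raw LLM response: a JSON array of coding records, one per comment.
raw = '''[
  {"id": "ytr_example", "responsibility": "developer",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]'''

records = json.loads(raw)

# Index by comment id so any coded comment can be looked up directly.
by_id = {r["id"]: r for r in records}

record = by_id["ytr_example"]
for dimension in ("responsibility", "reasoning", "policy", "emotion"):
    print(f"{dimension}: {record[dimension]}")
```

The same lookup generalises to the full ten-record batch above; mismatches between the table and the raw record for an id would indicate a post-processing error rather than a model one.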