Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
These "hallucination" normally appear because (in chatgpt at least) it can't give the right information but has to fill that space in the answer so it just puts out some words there. :)
youtube AI Governance 2023-04-19T13:4…
Coding Result
Dimension       Value
Responsibility  unclear
Reasoning       unclear
Policy          unclear
Emotion         indifference
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytr_UgyTUbB8dI4r5dCCo0p4AaABAg.9oelz33W1dw9ofDLBeSbhE", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_Ugyn-3x70SrOXLns7kB4AaABAg.9oeldZHVkUh9ofI3Yk-kSH", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"},
  {"id": "ytr_UgwYZy_NfwnGJ25O3KJ4AaABAg.9oekg4XrEYD9og7bPDvu4Q", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytr_UgwYZy_NfwnGJ25O3KJ4AaABAg.9oekg4XrEYD9ogL_i7SIMm", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgyA8zpEpul2HFq3WmR4AaABAg.9oejXhobwc29oekYWEyZX0", "responsibility": "developer", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytr_UgzvXg0Plhq3PV9X6hp4AaABAg.9oeggq48RXf9ogcaK57TE2", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytr_UgzvXg0Plhq3PV9X6hp4AaABAg.9oeggq48RXf9oh4-b25w-N", "responsibility": "company", "reasoning": "contractualist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytr_UgziWOaTSjVb4GzmXaJ4AaABAg.9oef0ZoHCh79ofDmoIkVIX", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytr_UgziWOaTSjVb4GzmXaJ4AaABAg.9oef0ZoHCh79og_MPTAD2B", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgziWOaTSjVb4GzmXaJ4AaABAg.9oef0ZoHCh79ognxeXjF7H", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]
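A response like the one above can be turned into the per-comment coding table with a small parsing step. The sketch below is a minimal, hypothetical example (the `CODEBOOK` label sets and the abbreviated ids are assumptions, not the project's actual codebook): it loads the JSON array, validates each record's labels against the allowed values for each dimension, and indexes the records by comment id.

```python
import json

# Abbreviated example response in the same shape as the raw output above
# (hypothetical ids; real ids look like "ytr_UgyTUbB8dI4r...").
raw = '''
[
  {"id": "ytr_abc", "responsibility": "ai_itself", "reasoning": "unclear",
   "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_def", "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "resignation"}
]
'''

# Assumed label sets per dimension, inferred from the values visible in the
# response above -- replace with the project's actual codebook.
CODEBOOK = {
    "responsibility": {"ai_itself", "developer", "company", "unclear"},
    "reasoning": {"consequentialist", "contractualist", "unclear"},
    "policy": {"regulate", "ban", "liability", "unclear"},
    "emotion": {"indifference", "resignation", "approval", "fear", "outrage"},
}

def parse_codings(text):
    """Parse the raw LLM response and validate each record's labels."""
    by_id = {}
    for rec in json.loads(text):
        for dim, allowed in CODEBOOK.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: invalid {dim}={rec.get(dim)!r}")
        by_id[rec["id"]] = rec
    return by_id

codings = parse_codings(raw)
print(codings["ytr_abc"]["emotion"])  # indifference
```

Validating against a fixed codebook catches the case where the model invents a label outside the scheme, which would otherwise surface later as an "unclear"-like bucket in the coding results.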