Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I don’t think “reducing hallucinations” is the right focus. Yes, of course that needs to happen; but the change will come in a fundamentally different neural network design - different from predictive AI.
youtube AI Responsibility 2025-09-30T22:5…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          unclear
Emotion         mixed
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugyv9Xz4hdgYJlgrymJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgybPUpUsgTGhCTgeqh4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxRhM41A8DaWi8KdR54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxrBEy8UNEaNmtgcrV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyVPd0v2fM_XejOz0F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwPa6tSLz05f7AfDI54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgwnEIP2W57cmmNRXA54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugy40CVUdgivstnqOaV4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugzuq-S3BwfVxnB_3tJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz-2lJQHEp9VXyyXwh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"approval"}
]