Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
AI should never be used for things like this without human oversight. The main reason for this is that you can't hold a machine responsible if it gets something wrong. You can however hold people responsible.
youtube 2026-01-05T15:5…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       deontological
Policy          liability
Emotion         fear
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgwvnCZ-0t4rKiP-LcR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzXGWrzdb0r6UrC-h94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxFeWpvq1hvnhu4Dct4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwI-dyBSdAYewAlZSJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxQgBCGrIgZQpZPool4AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgzoLw3EpWanblxC-Kh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyIa00mXIutdUYxm8l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyyLQm7uXiNu9jeo914AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzsTTfUQ1y_ctFEaSp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugwo2ycOQFA8iwTyXql4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]
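The raw response is a JSON array with one coding object per comment, keyed by the comment's id. A minimal Python sketch of how such a response could be parsed and a single comment's coding looked up (the variable name raw_response is an assumption; the ids and values are taken from the response above):

```python
import json

# Raw LLM response text (two entries from the batch above, for illustration).
raw_response = '''[
  {"id": "ytc_UgzXGWrzdb0r6UrC-h94AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgxFeWpvq1hvnhu4Dct4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]'''

# Index the coding objects by comment id for direct lookup.
codings = {entry["id"]: entry for entry in json.loads(raw_response)}

# Retrieve the coding for the comment shown above.
coding = codings["ytc_UgzXGWrzdb0r6UrC-h94AaABAg"]
print(coding["responsibility"], coding["reasoning"])  # ai_itself deontological
```

Indexing by id makes it easy to join the model's codings back to the original comments, and a malformed response will surface immediately as a json.JSONDecodeError.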