Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
13:21 If anything about LLMs freak you out, you don’t understand them well enough, and you have taken too much of their marketing terms at face value. LLM inference is not doing “thinking” or “reasoning” and have no “awareness”, even if those terms are good at selling a product, and might be useful shorthand for researchers.
YouTube | AI Moral Status | 2025-11-08T17:4…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          unclear
Emotion         approval
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgxDmo18c2vvdm1yQ7h4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "respect"},
  {"id": "ytc_UgwGAlQGZLoSE-kNHEN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugx0pkUTj6ztRmqe7uZ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzwMdCLcVnMTJGqkut4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwKSjjPDLtSP49LfhR4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugz5TSj3WYtiZAakzZp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugx-pyFjAE_0WygVeJx4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgxMQUrhvcX5Pv4ODC14AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "industry_self", "emotion": "resignation"},
  {"id": "ytc_UgwDtF7IlUnNsyMGMSJ4AaABAg", "responsibility": "ai_itself", "reasoning": "virtue", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwLyuIC0e67JM9LqrJ4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]
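To inspect a raw response like the one above programmatically, it can be parsed as a JSON array and indexed by comment id. The sketch below is a minimal, hypothetical helper (the field names are taken from the response above; the validation rules are assumptions, not a documented spec):

```python
import json

# The four coding dimensions plus the comment id, as seen in the raw response.
# This field list is inferred from the data above, not from any formal schema.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}


def index_codings(raw: str) -> dict:
    """Parse a raw LLM coding response and map each comment id to its coding."""
    entries = json.loads(raw)
    codings = {}
    for entry in entries:
        missing = REQUIRED_FIELDS - entry.keys()
        if missing:
            raise ValueError(f"entry {entry.get('id')!r} missing fields: {missing}")
        # Store every dimension except the id itself.
        codings[entry["id"]] = {k: v for k, v in entry.items() if k != "id"}
    return codings


# Example with the first entry from the raw response above:
raw = '[{"id":"ytc_UgxDmo18c2vvdm1yQ7h4AaABAg","responsibility":"none",' \
      '"reasoning":"unclear","policy":"unclear","emotion":"respect"}]'
codings = index_codings(raw)
print(codings["ytc_UgxDmo18c2vvdm1yQ7h4AaABAg"]["emotion"])  # respect
```

This makes it easy to cross-check the per-comment table against the raw model output, e.g. to spot a mismatch between the displayed emotion and what the model actually returned.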