Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
LLMs hallucinate 100% of the time, it's just that with enough training data the hallucinations often happen to also be the truth.
Source: youtube · AI Moral Status · 2025-11-07T07:5…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       unclear
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugxv-zHDsNnmwwXt8ep4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxhKyOJ_g060OafjdF4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyqoNgnwMo1UrOZpLJ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgwfrBkP777wgL4Uur94AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxXZiVeLMT39wiy7Jh4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxHi4MjED_ibG6Mem54AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxT5lLPOzqUD7pTuf14AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyhCIG_TPm0c6Id5OF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwOw1yP3aNyo1iPuPl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyOVY6lrCouVtN3XUR4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "resignation"}
]
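The raw response above is a JSON array with one record per comment, so tracing a single comment's coding means parsing the array and matching on its `id`. A minimal sketch of that lookup, in Python; the two sample records are copied from the raw response, and the helper name `index_codings` is illustrative, not part of any real tool:

```python
import json

# Two records copied verbatim from the raw LLM response above (sample only).
RAW_RESPONSE = """
[
 {"id": "ytc_Ugxv-zHDsNnmwwXt8ep4AaABAg", "responsibility": "government",
  "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
 {"id": "ytc_UgxXZiVeLMT39wiy7Jh4AaABAg", "responsibility": "none",
  "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
"""

def index_codings(raw: str) -> dict:
    """Parse the raw JSON array and key each coding record by its comment id."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codings = index_codings(RAW_RESPONSE)
# Look up the coding for the comment shown on this page.
record = codings["ytc_UgxXZiVeLMT39wiy7Jh4AaABAg"]
print(record["responsibility"], record["emotion"])  # → none indifference
```

Keying by `id` makes the lookup robust to the model returning records in a different order than the comments were sent.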