Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
18:00 sooo, youre telling me AI basically predicts information based off of what it thinks should be right? tell me if im wrong, but isnt that how the human brain works? visual illusions are my best example of this, and imagination comes from inspiration. the idea of calling it a hallucination kinda reminds me of how no person really see the world the same. that is hella interesting
youtube AI Moral Status 2025-11-12T16:3… ♥ 1
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       mixed
Policy          none
Emotion         approval
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgylT8svfl2oMUW4U-F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyLfTtZm68taB7U9cp4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxN-eoii1kT-akvwbF4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyxl84xUl_8ihgo6Oh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyI7zZSTLvif6F1Eex4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy_r8BM6oN8TggtiKZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw-fujppI_piFchIax4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwpWiqiuGSZK0WrC594AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzoqF7ccItFYIIxCMl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxj-Qku7l1HM-CHoU94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
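The raw response is a JSON array of coding objects keyed by comment id, so recovering the dimensions shown in the table for any one comment is a simple parse-and-lookup. A minimal sketch, assuming the response parses as valid JSON (the variable names and the inline sample below are illustrative, not part of any real tool):

```python
import json

# Illustrative excerpt of a raw LLM response in the format shown above.
raw_response = """[
  {"id": "ytc_UgxN-eoii1kT-akvwbF4AaABAg",
   "responsibility": "developer",
   "reasoning": "mixed",
   "policy": "none",
   "emotion": "approval"}
]"""

# Index the codings by comment id for direct lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

row = codings["ytc_UgxN-eoii1kT-akvwbF4AaABAg"]
print(row["reasoning"])  # mixed
print(row["emotion"])    # approval
```

In practice one would also want to check that each value falls within the coding scheme's allowed labels (e.g. that "reasoning" is one of the expected categories) before trusting the output, since the model is not guaranteed to return well-formed codes.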