Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
"They can have thoughts in sentences and then go back and think aboit those thoughts" ...Noooo, they can generate text and then generate more text about the tezt they generated, that's all. If you ask an LLM to explain its thinking it's just making you a nice little story about a computer that had a thought. Because computers can't think.
Source: YouTube · AI Moral Status · 2025-11-12T19:1… · ♥ 3
Coding Result
Dimension       Value
Responsibility: ai_itself
Reasoning:      consequentialist
Policy:         none
Emotion:        indifference
Coded at:       2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgylT8svfl2oMUW4U-F4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyLfTtZm68taB7U9cp4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxN-eoii1kT-akvwbF4AaABAg", "responsibility": "developer", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugyxl84xUl_8ihgo6Oh4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyI7zZSTLvif6F1Eex4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugy_r8BM6oN8TggtiKZ4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugw-fujppI_piFchIax4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwpWiqiuGSZK0WrC594AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzoqF7ccItFYIIxCMl4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugxj-Qku7l1HM-CHoU94AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
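Since the model returns one JSON array covering a whole batch of comments, inspecting a single coded comment means parsing the array and looking up the row for that comment's id. A minimal sketch of that lookup, using two rows from the response above (the variable names here are illustrative, not part of any tool's API):

```python
import json

# Raw LLM response: a JSON array of per-comment coding rows
# (two rows from the batch above, for brevity).
raw_response = '''[
  {"id": "ytc_UgylT8svfl2oMUW4U-F4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugyxl84xUl_8ihgo6Oh4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]'''

# Index the rows by comment id for direct lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Fetch the coding for one comment and print its dimensions.
row = codings["ytc_Ugyxl84xUl_8ihgo6Oh4AaABAg"]
print(row["responsibility"], row["reasoning"], row["policy"], row["emotion"])
# → ai_itself consequentialist none indifference
```

Indexing by id also makes it easy to spot batch problems: a missing id means the model dropped a comment, and a duplicate id silently overwrites an earlier row.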