Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
You can't have true knowledge and understanding without conceptualization. AI has no such ability so it mimics us by converting questions into search or fuzzy logic problems. As you said, we "feel" that it understands us. But it doesn't.
Source: youtube · AI Responsibility · 2025-10-19T21:4…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        deontological
Policy           none
Emotion          resignation
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgxW7VrcDxo1BYimqA14AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxK17EqoJbTiwDdG7Z4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugy0MNzA3MMtDU1e2jN4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzCDMGq460IsYDoUW54AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "resignation"},
  {"id": "ytc_UgzDaWml_F-2d66cOC94AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugyv3qUxf1ffCyMX2xR4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugws5h1CdJnByFHdp3B4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxrogRh3BVCkTdMSz14AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzV4bNvbP_QWNLT7e54AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugy9u7RTMNvH6Jq47xN4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "resignation"}
]
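A minimal sketch of how a raw response like the one above might be consumed downstream. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the JSON itself; the `index_codings` helper and its missing-key check are illustrative, not part of the actual pipeline.

```python
import json

# Excerpt of the raw LLM response above: a JSON array of per-comment codings.
raw = '''[
  {"id": "ytc_UgzCDMGq460IsYDoUW54AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "industry_self", "emotion": "resignation"},
  {"id": "ytc_Ugy9u7RTMNvH6Jq47xN4AaABAg", "responsibility": "none",
   "reasoning": "deontological", "policy": "none", "emotion": "resignation"}
]'''

REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw_json: str) -> dict:
    """Parse the raw response and index codings by comment id,
    dropping any entry that is missing a required key."""
    entries = json.loads(raw_json)
    return {e["id"]: e for e in entries if REQUIRED_KEYS <= e.keys()}

codings = index_codings(raw)
print(codings["ytc_Ugy9u7RTMNvH6Jq47xN4AaABAg"]["emotion"])  # resignation
```

Indexing by `id` makes it straightforward to join each coding back to its source comment, as the Coding Result table above does for a single comment.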