Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Isn't this still a misunderstanding of what LLMs are? They can say bad things, they can say good things (leaving aside the moral question that, for some people, bad is good), but the model is just predicting the order of words. It has no understanding of what it is saying. To the LLM itself, words could be colors and a sentence the right rainbow; the output has no meaning beyond being a merely "correct" sequence.
YouTube AI Moral Status 2025-12-19T20:0…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       mixed
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_Ugx5T4mYKR50_dj5QtR4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgygyYDKqh2Rdx9HWe94AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwKTToVjqCp7P17REN4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgwPe94jP3ib4jXT3pp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzzuwrEKzqf9QFTVWJ4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugw6ViEc_nzEzXLjGbN4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugy7cV_Fcm33AjORmoh4AaABAg", "responsibility": "ai_itself", "reasoning": "virtue", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugyy63d50gX3_ucPwZ54AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzWvRByl4Pc56xshF14AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgySbywkq-C3oiN71E14AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
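The raw response is a JSON array of per-comment records, each keyed by comment id and carrying the four coding dimensions shown in the table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed into a per-comment lookup — the helper name and sample id are illustrative assumptions, not part of any real pipeline:

```python
import json

# The four coding dimensions seen in the raw response above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

# Hypothetical sample payload in the same shape as the raw LLM response;
# the id "ytc_abc" is made up for illustration.
RAW = '[{"id": "ytc_abc", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"}]'

def parse_coding(raw: str) -> dict:
    """Map comment id -> coded dimensions, falling back to 'unclear'
    for any dimension the model omitted."""
    records = json.loads(raw)
    return {
        r["id"]: {d: r.get(d, "unclear") for d in DIMENSIONS}
        for r in records
    }

coded = parse_coding(RAW)
print(coded["ytc_abc"]["emotion"])  # indifference
```

Keying by id rather than list position makes joining the codes back onto the original comments order-independent, which matters when the model returns entries in a different order than they were submitted.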