Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
16:18 that’s not why LLMs “hallucinate” or why they don’t return “I don’t know”. Without additional infrastructure, LLMs can only provide results based on the relationships the model contains and the relationships that are engaged by the inputs.
youtube AI Moral Status 2025-11-10T21:5…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgxLcyYFwx7NeEUT9wB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzgpyTtDUUpNA5pTTF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwsscmCQD8OW7DlVCd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwJjctm4NGlg3_N33t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzC4_e1OB3A5QQGEjR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzL6ob7Cwc5zyr9vUJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxD2GDorcCPyUjp09x4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugy30ZTxjxKOZ-bosVJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxWxX1vRL-3sJtRug94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyWjp2s39Xw1b-8fxx4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"none","emotion":"mixed"}
]
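The raw response is a JSON array with one object per comment, each carrying the four coding dimensions shown in the result table. A minimal Python sketch of how such a response might be parsed and validated before display; the allowed value sets are inferred from the codes visible above, not from a published codebook, and `parse_codes` is a hypothetical helper name:

```python
import json

# Allowed values per dimension (assumed from the codes seen in this response).
ALLOWED = {
    "responsibility": {"none", "developer", "user", "company", "ai_itself"},
    "reasoning": {"unclear", "deontological", "consequentialist", "contractualist"},
    "policy": {"none", "ban", "liability", "regulate"},
    "emotion": {"indifference", "outrage", "fear", "mixed"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, rejecting unknown values."""
    coded = {}
    for rec in json.loads(raw):
        codes = {dim: rec[dim] for dim in ALLOWED}
        for dim, value in codes.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{rec['id']}: unexpected {dim}={value!r}")
        coded[rec["id"]] = codes
    return coded

raw = ('[{"id":"ytc_UgzL6ob7Cwc5zyr9vUJ4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
codes = parse_codes(raw)
print(codes["ytc_UgzL6ob7Cwc5zyr9vUJ4AaABAg"]["reasoning"])  # consequentialist
```

Validating against a fixed value set catches the failure mode this page exists to inspect: a model that drifts from the prompt's label vocabulary produces an entry here that raises instead of silently landing in the results table.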