Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
How I think about hallucinations: if you ask an AI to generate a photograph of a cat, of course you know it’s not a real cat that exists - it’s just a bunch of pixels approximating a cat. If you ask it to write you an essay, the facts are also just approximations of real facts.
youtube AI Moral Status 2025-10-30T23:3…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       mixed
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugx533xVo-hSoW3STyF4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgznUdxzETHRyzE4L8t4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugxvpx4B5WAI1AG8d2F4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzlZs1Bk1mY4KiAxKx4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugy0jut33-HQcZnXaWJ4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw7SCNpTM5aM7M6FdF4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzLHDzE6jDrpPtKtnN4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgyJOTqlMJWZtjCj7894AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzFKq2YDOqwxlaeXqt4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgykcxymMbgSrsjYauR4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"}
]
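A minimal sketch of how a response like the one above can be parsed and validated before the codes are stored. This is an illustrative example, not the tool's actual implementation: the two embedded records are copied from the raw response, and the `ALLOWED` vocabulary is inferred from the values seen here, so the real codebook may define more.

```python
import json

# Two records copied from the raw response above; a full run would load all ten.
RAW = (
    '[{"id":"ytc_Ugx533xVo-hSoW3STyF4AaABAg","responsibility":"none",'
    '"reasoning":"consequentialist","policy":"none","emotion":"indifference"},'
    '{"id":"ytc_UgzlZs1Bk1mY4KiAxKx4AaABAg","responsibility":"company",'
    '"reasoning":"deontological","policy":"liability","emotion":"outrage"}]'
)

# Allowed values per dimension, inferred only from the codes in this response.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "distributed", "company"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"indifference", "resignation", "fear", "outrage", "approval"},
}

def index_codes(raw: str) -> dict:
    """Parse the raw LLM response and index records by comment id,
    skipping any record with an out-of-vocabulary value."""
    out = {}
    for rec in json.loads(raw):
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            out[rec["id"]] = rec
    return out

codes = index_codes(RAW)
print(codes["ytc_UgzlZs1Bk1mY4KiAxKx4AaABAg"]["policy"])  # liability
```

Indexing by the `ytc_…` comment id makes it easy to join a record back to the coding-result panel for that comment; the vocabulary check catches a model that drifts outside the codebook.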