Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
20:53 not anymore alien than an IQ test or any diagnostic rubric we use to diagnose people. The complete misunderstanding is that LLMs are not a consciousness, AI, but an interface - and it’s clear that this is mostly just a thought experiment ungrounded from reality of self-attention based talking databases.
youtube AI Moral Status 2025-10-31T21:4…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       deontological
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgxDlAQpJvFbgGM4r4N4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyjUsv4wUBOyvwEwRd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugwaha5FvqKpPn5hTL14AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzQ7alvMqtC2j7XWyN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyBNlFBAtH6vnIH_7F4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxD7d65FwNleHg6ndh4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxdJNz5J6OmXBHtUHR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugzc0TaJYKf3z5Gpc6F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwDSWrHCmEbQb6BGxp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzeml3bGYbm1250Kup4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"}
]
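Because the model returns one JSON array covering a whole batch of comments, the coding result for a single comment has to be located by its `id`. A minimal sketch of that lookup step, assuming the raw response is available as a string (the `extract_coding` helper name is hypothetical, not part of the tool shown here):

```python
import json
from typing import Optional

# Abbreviated raw response string, mirroring the structure shown above.
raw = '''[
  {"id": "ytc_Ugwaha5FvqKpPn5hTL14AaABAg", "responsibility": "none",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxDlAQpJvFbgGM4r4N4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "none", "emotion": "outrage"}
]'''

def extract_coding(raw_response: str, comment_id: str) -> Optional[dict]:
    """Parse the LLM's JSON array and return the coding dict for one comment.

    Returns None when the response is not valid JSON or the id is absent,
    so malformed model output never crashes the pipeline.
    """
    try:
        items = json.loads(raw_response)
    except json.JSONDecodeError:
        return None
    if not isinstance(items, list):
        return None
    for item in items:
        if isinstance(item, dict) and item.get("id") == comment_id:
            return item
    return None

coding = extract_coding(raw, "ytc_Ugwaha5FvqKpPn5hTL14AaABAg")
print(coding["emotion"])  # prints: indifference
```

Matching by `id` rather than by array position keeps the lookup robust when the model drops, reorders, or duplicates entries in the batch.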