Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Sure, anything could be conscious. Why not? There's no way to ever prove it, of course, so the point is moot. Consciousness is subjective, so unless you're going to turn yourself into an LLM, you can never know the answer.
YouTube · AI Moral Status · 2026-01-27T14:5…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        mixed
Policy           none
Emotion          resignation
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgzpC_sSTdCABRcpkcB4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwZBIHr1lIRG8Da4BZ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzlbOgf_YXMfk0KUW54AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzhvsPs_HukQQKjnhh4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugx907P2HV9Jdpi2lLJ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugz576o4cXWfR8xOpRp4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugxnzx6dnIVgf8w6Okp4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx-b0bTRNWx8VKX9d54AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugwv8EFs_jnNfnSU_ax4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyNeYV78JpvKIi3PNt4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"}
]
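When working with raw responses like the one above, it helps to check that the model stayed inside the coding schema before accepting the batch. Below is a minimal sketch in Python of such a validation pass. The four dimension names come from the table above; the allowed value sets are inferred only from values visible in this output and are likely incomplete, so treat them as placeholders for the real codebook.

```python
import json

# Two records copied from the raw response above (truncated for brevity).
raw = '''[
  {"id": "ytc_UgzpC_sSTdCABRcpkcB4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugwv8EFs_jnNfnSU_ax4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "resignation"}
]'''

# Value sets inferred from the visible output; the actual codebook
# may define more categories.
DIMENSIONS = {
    "responsibility": {"none", "ai_itself"},
    "reasoning": {"mixed", "consequentialist", "deontological", "virtue"},
    "policy": {"none"},
    "emotion": {"approval", "indifference", "outrage", "resignation"},
}

def validate(records):
    """Return (id, dimension, value) triples that fall outside the known sets."""
    bad = []
    for rec in records:
        for dim, allowed in DIMENSIONS.items():
            if rec.get(dim) not in allowed:
                bad.append((rec.get("id"), dim, rec.get(dim)))
    return bad

records = json.loads(raw)
print(validate(records))  # [] when every value is in the known sets
```

A check like this catches the common failure mode where the model invents a category (or misspells one) mid-batch, which would otherwise surface later as a silent gap in the coded dataset.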