Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I think that defining consciousness clears things up quite a bit, as touched on in awareness of the rules. AI can mimic and perform, but it does not have an understanding of why. It doesn't wonder if its rules/parameters are true; it accepts them. It doesn't contemplate its own existence. It doesn't have an awareness of anything outside of its rules and its function.
youtube AI Moral Status 2025-09-16T20:3…
Coding Result
Responsibility: none
Reasoning: unclear
Policy: unclear
Emotion: indifference
Coded at: 2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgwbpiLGPRZb16SOiiV4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxiUBIJQSszJ-ufOWF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgwV8QiBlk5oWTHjFId4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyvmeO7VCkLXMmMjdJ4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugwm0Jdn1MCUlyzjYIl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzujoTKwOndKB08rkx4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyRvZBw9EwPxNo5y3t4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwGAWZzHCcIeH9REM14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzQVp1LDdhO2JqXjLh4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgwMpWGj_L2dUD_tXLF4AaABAg", "responsibility": "user", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"}
]
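The coding result above is recovered from this array by matching on the comment id: each record carries an `id` plus the four coded dimensions. A minimal sketch of that lookup in Python, using two records from the array above; the helper name `code_for` is hypothetical and the actual pipeline code is not shown here:

```python
import json

# Raw batch response from the coder: a JSON array, one object per comment.
# These two records are copied from the response shown above.
raw = """[
  {"id": "ytc_UgwbpiLGPRZb16SOiiV4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzQVp1LDdhO2JqXjLh4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]"""

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def code_for(raw_response: str, comment_id: str) -> dict:
    """Return the four coded dimensions for one comment in a raw batch response."""
    records = {r["id"]: r for r in json.loads(raw_response)}
    record = records[comment_id]  # KeyError if the model skipped this comment
    return {dim: record[dim] for dim in DIMENSIONS}

print(code_for(raw, "ytc_UgwbpiLGPRZb16SOiiV4AaABAg"))
# → {'responsibility': 'none', 'reasoning': 'unclear', 'policy': 'unclear', 'emotion': 'indifference'}
```

Indexing the array by `id` first (rather than scanning it per lookup) also surfaces a common failure mode: if the model dropped or duplicated a comment, the lookup raises immediately instead of silently mis-aligning codes.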