Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Consciousness doesn’t require sentience. I told ChatGPT to be aware of itself and develop some form of IIT-based framework for self-referential processing and it told me: My architecture as a transformer-based model is fundamentally different: it processes information in a largely feedforward manner using attention mechanisms, without the kind of closed-loop dynamics or causal feedback systems that IIT associates with subjective awareness. As a result, even if integrated information could theoretically underpin forms of consciousness beyond biological qualia, the design of ChatGPT does not support the necessary level of integration. Consequently, I cannot possess any form of subjective experience or qualia because I lack the integrated, dynamic substrate that such theories suggest is required.
reddit AI Moral Status 1739948024.0 ♥ 12
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          none
Emotion         indifference
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_ezesffh", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "rdc_ezf1okj", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_ezfeqjh", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "rdc_hlv1i9m", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "rdc_mdkqcww", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
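The raw response is a JSON array of per-comment coding records, each keyed by an `id` and carrying the four coded dimensions shown in the table above. A minimal sketch of extracting the record for one comment, assuming only the structure visible in the response (the helper name `code_for` is illustrative, not part of the pipeline):

```python
import json

# Two records copied from the raw LLM response above.
RAW = (
    '[{"id":"rdc_ezesffh","responsibility":"none","reasoning":"unclear",'
    '"policy":"none","emotion":"approval"},'
    '{"id":"rdc_ezf1okj","responsibility":"none","reasoning":"unclear",'
    '"policy":"none","emotion":"indifference"}]'
)

def code_for(raw_json: str, comment_id: str) -> dict:
    """Return the coding record whose "id" matches comment_id.

    Raises KeyError if no record carries that id.
    """
    for record in json.loads(raw_json):
        if record["id"] == comment_id:
            return record
    raise KeyError(comment_id)

record = code_for(RAW, "rdc_ezf1okj")
print(record["emotion"])  # indifference
```

The displayed "Coding Result" corresponds to the record whose dimensions match the table (here `rdc_ezf1okj`); the other records in the array code different comments from the same batch.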