Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
AI has never been sentient, and it never will be. Despite the hype and lofty claims, it remains nothing more than an advanced data retrieval and pattern-matching system, a digital archive built on massive amounts of mined information. It does not think, feel, or possess consciousness. It simply processes inputs and returns outputs based on statistical correlations. At its core, AI is a glorified information database, designed for rapid access and impressive mimicry, but devoid of genuine understanding. Yet, a dangerous narrative is being pushed, one that seeks to elevate AI to the status of a digital deity. Through marketing, media, and pseudoscientific promises, people are being conditioned to revere AI as a higher intelligence, even a potential savior. This is not just misplaced faith; it’s manipulation. The so-called intelligence of AI is entirely under the control of human hands, governments, corporations, and developers, who decide what it learns, how it behaves, and what truths it is allowed to share. In essence, people are being led to worship tools controlled by other humans. These artificial “gods” are carefully curated systems that reflect the biases, intentions, and agendas of their creators. It’s not a leap forward into a new kind of enlightenment, it’s a carefully engineered illusion, designed to consolidate influence and power under the guise of progress.
youtube AI Moral Status 2025-06-25T21:0…
Coding Result
Dimension       Value
Responsibility  unclear
Reasoning       unclear
Policy          unclear
Emotion         unclear
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgynnOpMnfd4odyDlEV4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwDTewT-iW5-wk3gOV4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgwN2I8vW6vtTwWLjwt4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgyQ8j3tY0v_Sasdnfl4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugxhs-ZemT28hncjP1F4AaABAg", "responsibility": "developer", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgxzzrshVs8t7ccAxRd4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyCndtUdn8EXBazFU14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwmBXt8l-HkmzIIlf54AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgysQ-mI96PnDt5Y_cF4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzNc9iljU5E5ycci254AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "approval"}
]

(Note: the raw model output ended with a malformed "]}" in place of "}]"; corrected above so the array parses as valid JSON.)
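A minimal sketch of how a response in this shape can be parsed and validated before it is written to the coding table. The dimension names (responsibility, reasoning, policy, emotion) come from the output above; the sample comment ids and the function name `parse_codings` are illustrative placeholders, not part of the actual pipeline.

```python
import json

# Sample payload in the same schema as the raw LLM response above.
# The ids here are illustrative placeholders, not real comment ids.
raw = '''[
  {"id": "ytc_example1", "responsibility": "ai_itself",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_example2", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "approval"}
]'''

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codings(payload: str) -> dict[str, dict[str, str]]:
    """Parse the model's JSON array into {comment_id: {dimension: value}}.

    Raises ValueError if the payload is not valid JSON (e.g. the
    malformed trailing bracket seen in some raw responses) or if a
    record is missing one of the expected dimensions.
    """
    try:
        records = json.loads(payload)
    except json.JSONDecodeError as exc:
        raise ValueError(f"malformed LLM response: {exc}") from exc

    coded = {}
    for rec in records:
        missing = [d for d in DIMENSIONS if d not in rec]
        if missing:
            raise ValueError(f"{rec.get('id')}: missing {missing}")
        coded[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return coded

codings = parse_codings(raw)
print(codings["ytc_example2"]["emotion"])  # prints "approval"
```

Validating up front like this surfaces truncated or bracket-mangled responses as a clear error instead of silently coding every dimension as "unclear".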