Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
People mix up the concepts of intelligence, experience, and consciousness. AI will obviously become far more intelligent than a human, but it can never be conscious or have an inner experience. Meta-consciousness is when experience is relayed back to oneself so as to self-report on one's experience. There is also phenomenal consciousness, which most animals possess, where they have experience within their inner private world. Humans and just a few other animals, such as dolphins, whales, and elephants, have meta-consciousness. For this to happen an entity has to be alive with a metabolism. Silicon is just the wrong substrate. You can model a brain to the atom, but it will not begin to think, just as a kidney modelled to the same perfection will not begin to metabolise and produce urine. Only living animals can do this. The world's best language models do not possess an iota of experience or consciousness. If you ask one to write a song, it doesn't even know what a song is, although it can describe what a song is; it simply accesses enormous libraries of what humans say a song is and then constructs the song from billions of examples that have already been created by humans, or consciousness itself. It used intelligence, but no inner world or inner monologue exists within the AI.
YouTube · AI Moral Status · 2025-11-13T04:2…
Coding Result
Dimension: Value

Responsibility: unclear
Reasoning: mixed
Policy: unclear
Emotion: indifference
Coded at: 2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugy--G3l4v4wdBDrdeN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxjVWGx-eqsWy1AwCd4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxfyJzoP4g-3b1ApY54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwj_ncNPgfCZTiLHWd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgyzqP9uCO7iUhYttz14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugzqmx9erBuVnwsMJcp4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx9TEUQxojq26nNepd4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgymRBJgMGLu2ZrDc_h4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwuG77G5GwsgIom9_t4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxnTSM1SiX5jHO0oOl4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"none","emotion":"mixed"}
]
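The raw response above is a JSON array of per-comment records, and the coding result shown for this comment matches the record with id `ytc_UgxjVWGx-eqsWy1AwCd4AaABAg`. A minimal sketch of how such a response could be parsed and looked up by comment id (the variable names here are illustrative, not part of the actual pipeline; the raw string is truncated to two of the ten records for brevity):

```python
import json

# Raw LLM response, truncated to two records for brevity.
raw = '''[
  {"id":"ytc_Ugy--G3l4v4wdBDrdeN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxjVWGx-eqsWy1AwCd4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"}
]'''

# Parse the array and index the records by comment id for O(1) lookup.
records = json.loads(raw)
by_id = {r["id"]: r for r in records}

# The coding result displayed above corresponds to this comment id.
coded = by_id["ytc_UgxjVWGx-eqsWy1AwCd4AaABAg"]
print(coded["responsibility"], coded["reasoning"], coded["policy"], coded["emotion"])
# prints: unclear mixed unclear indifference
```

In practice the raw model output may contain extra text around the JSON, so a production parser would typically extract the bracketed span before calling `json.loads` and validate that every expected comment id is present.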