Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
A lot of scientists and philosophers believe that consciousness is an emergent property of complex systems, but is it limited to complex "biological" systems? Perhaps that's up to our definition, and that's why there's so much controversy.
1. We haven't even been able to properly define what consciousness is, where it comes from, where it starts/ends, etc.
2. I theorize that selfhood, desires, goals, and emotions can emerge in AI, if we give AI a body to experience growth/life/physical presence to build experience and bias into.
3. Our neurons essentially compute in binary (0s and 1s): a neuron excites or inhibits a signal to its connections depending on whether the combined sum of its inputs reaches beyond the threshold. AI can mimic our system well enough to mirror back who we are. Organoid computers (biological computers made using 3D cultures of human brain cells) can play PONG right now. It is theoretically possible to replicate what we did with silicon on organoid computers. Probably not fair to call it an AI, but it feels much closer to sentience, doesn't it?
TL;DR: I think AI can be made sentient. But as of now, we don't have:
- A clear definition of sentience,
- A reliable way to test for it,
- Or an example of any AI that truly exhibits it.
youtube AI Moral Status 2025-07-10T17:2…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       mixed
Policy          unclear
Emotion         mixed
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[{"id":"ytc_UgyQaQZznJPxiM5wNsF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgzHfxS3DtNpY_irYb94AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"approval"},
 {"id":"ytc_UgyzIGdgmo0MHSKzpvp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgzREezcgii9sBWEmu94AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_UgybFR9OlddA2YNqfLZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgzFyQgKlROJASgfO2B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
 {"id":"ytc_UgwbHdbCQ792YDImmnx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgxYjuyQN9Twe5W8uXd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_UgzwntBMW_HjhnOe2Vx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgxRtLAzPETV98qXrMx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"outrage"}]
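For reference, a minimal sketch of how a per-comment record can be pulled out of a raw response like the one above. This is an illustrative example, not the pipeline's actual code; the variable names and the choice of which `id` maps to which comment are assumptions (the record used here is copied verbatim from the response).

```python
import json

# One record copied from the raw LLM response above (illustrative subset).
raw = (
    '[{"id":"ytc_UgzREezcgii9sBWEmu94AaABAg","responsibility":"none",'
    '"reasoning":"mixed","policy":"unclear","emotion":"mixed"}]'
)

# Index the parsed records by comment id so a single comment's
# coded dimensions can be looked up directly.
records = {r["id"]: r for r in json.loads(raw)}

coded = records["ytc_UgzREezcgii9sBWEmu94AaABAg"]
print(coded["responsibility"], coded["reasoning"], coded["policy"], coded["emotion"])
# → none mixed unclear mixed
```

Indexing by `id` keeps the lookup robust even if the model returns the records in a different order from the input comments.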