Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
35:53 Bro, no; *I am confident it is not having internal experiences.* Consciousness is almost certainly related to the Thalamo-Cortico network & the TRN's inhibitiory function. There is nothing like that biological network's function in how the current tech functions. Nothing even _close._ And sure, we can learn about how our conscious experience functions by understanding _the difference between_ how our brains function and how these machines function, but we can not learn about the conscious experience through direct observation of it in these machines because it is not there. Looking for it in these machines is a waste of time. We can't even be certain that _other people_ are conscious; we are forced to take their word on it. (I can not observe someone else's conscious experience 1st-hand.) Why erroneously extend the circle to contain an object that exists even further outside the existing somewhat trustable category (especially when again, that object does not posess _any_ of the patterns we can already _strongly_ associate with consciousness)? As alluded to earlier, it is quite possible (and I'd even go as far as to say most likely by a significant margin) that an understanding of how consciousness functions is the key to alignment. It _is_ what trains our thoughts to be the way they are, right? Maybe AI isn't aligned because we are missing the part of the puzzle that aligns it. (It is also a part of the puzzle we can observe aligns humans with their own interests: have you ever changed your own beliefs / actions for the better based on something that went on in your conscious experience?)
youtube AI Moral Status 2025-11-25T19:0…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          unclear
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugzi_mIBEDK2nqf1Xjt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw9dgpgeGBdmscrh8t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwpW2BjFOkyPlW11214AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxLMNStx9B9S5BWVUZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwUKr6mOhDvk1MZNEh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzSKYjZ5kxOkmO83ch4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzeDeNRPmGWIQ9oddt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz0SHuBVrP1las7Ngd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
  {"id":"ytc_Ugwf02DVA3c2wXSLkWN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxhB8j2OM60NEMB5-R4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
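To cross-check a coded record against the raw model output, the JSON array can be parsed and filtered on the four dimension values shown in the table above. A minimal sketch in Python, using two records excerpted verbatim from the response (note that more than one record in the full array carries these same values, so the filter alone does not uniquely identify which record corresponds to this comment):

```python
import json

# Two records excerpted verbatim from the raw LLM response above.
raw = '''[
  {"id":"ytc_Ugzi_mIBEDK2nqf1Xjt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw9dgpgeGBdmscrh8t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}
]'''

records = json.loads(raw)

# The coded dimension values from the table above.
coded = {"responsibility": "none", "reasoning": "consequentialist",
         "policy": "unclear", "emotion": "indifference"}

# Keep only the records whose values match the table.
matches = [r for r in records
           if all(r[k] == v for k, v in coded.items())]

print([r["id"] for r in matches])  # → ['ytc_Ugw9dgpgeGBdmscrh8t4AaABAg']
```

Against the full ten-record array, the same filter would return every record coded none/consequentialist/unclear/indifference, which is why the record `id` rather than the dimension values should be used as the join key when linking codings back to comments.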