Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Only thing I can argue with is the whole "AI making up articles made up in fake journals" thing, because an emulect, as you define it, would be of a single conscience, a single large being made up of many smaller beings. It could only do what our own minds can do, and it can't do that fast enough to have any effect. It would be as smart as humans, of course, maybe even smarter, but it would also have the same flaws that our own human intellect has, just shown in a different way. So, honestly, an emulect could maybe be worse at what we use LLMs for in a TON of different ways, It isn't a program finding the algorithmically perfect way to structure something, it's a person painting pixel by pixel, no brush in sight. I know it's anecdotal, but I always think of consciousness as the singular being. The one stream of life that someone can go through. The one experience of life. Think of the conundrum of "When you teleport, your atoms are disassembled, and you die, but another, exact copy, is created in the specified location." That clone would not have that same consciousness. My consciousness ended when I died. If an AI were to, say, become sapient, I believe, at that point, it is also conscious. It is the singular being.
youtube AI Moral Status 2024-06-11T18:0…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgyZX-6c3TbS4CCas4Z4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugwi1Ms13N8U_gVSPyB4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzWNdY4FwBQoUCS0j14AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugwn-G_2ql-D-9Oee6h4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw2TCbb-9xaUdmXaKR4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyXyLk0Kcft6V46FWZ4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugx3r-ZLheBX-HK3OFJ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzFAsteUOX71yUD2Mp4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyZFVurZwWb_aIsnPx4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxCdtMz2uVPbTxGWh94AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"}
]
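The raw response is a JSON array with one coding object per comment; the values shown in the table above come from the entry whose `id` matches this comment. A minimal Python sketch of how such a response could be parsed and checked against the codebook follows. The allowed value sets are an assumption inferred only from the labels observed in this batch; the real codebook may define more categories, and the function name `coding_for` is illustrative.

```python
import json

# Raw LLM response: a JSON array of per-comment codings
# (truncated to two entries here for brevity).
raw = '''[
  {"id": "ytc_UgyZX-6c3TbS4CCas4Z4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugwi1Ms13N8U_gVSPyB4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]'''

# Allowed values per dimension, inferred from labels seen in this batch
# (assumption: the actual codebook may contain additional categories).
ALLOWED = {
    "responsibility": {"none", "ai_itself"},
    "reasoning": {"mixed", "consequentialist", "deontological", "unclear"},
    "policy": {"none"},
    "emotion": {"mixed", "indifference", "fear", "outrage", "approval"},
}

def coding_for(comment_id, payload):
    """Return the coding for one comment id, validating each dimension."""
    for row in json.loads(payload):
        if row["id"] == comment_id:
            for dim, allowed in ALLOWED.items():
                if row[dim] not in allowed:
                    raise ValueError(f"unexpected {dim} value: {row[dim]!r}")
            return {dim: row[dim] for dim in ALLOWED}
    raise KeyError(comment_id)

result = coding_for("ytc_Ugwi1Ms13N8U_gVSPyB4AaABAg", raw)
```

Validating each object before display would surface any off-codebook label the model invents, rather than silently rendering it in the table.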