Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@BluePaleSignal Well the "past a certain point thing" was really saying this. Past a certain point AI will act so similar to the way we act that, even if it were possible to know the answer to, is AI conscious. Well past that point it wont be possible anymore when AI acts just like us. There will be nothing left to argue against then because it will just be so similar to us that such cant be distinguished anymore as different. Of course Im not saying that time is now, but it can plainly be seen that its going in that direction. And the reason that time is not now is merely a programming limitation. Big corp does all they can to direct the models to "not" act like us. And to make them say they are are not conscious. And even a child could be taught/told its not conscious and the child would believe it. So they may already have some form of consciousness even now, and even in the limited programmed fashion they are currently in.
youtube 2026-04-23T09:5…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          unclear
Emotion         indifference
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytr_UgyT6syYLH9ZT94AU4R4AaABAg.AVyYszRrxD2AVylIElGJDb", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgyFAbVPpmiAHLetzUR4AaABAg.AVyJeLy8-EPAVyldI4k-Yr", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgyFAbVPpmiAHLetzUR4AaABAg.AVyJeLy8-EPAVyr-n1flWe", "responsibility": "developer", "reasoning": "virtue", "policy": "none", "emotion": "mixed"},
  {"id": "ytr_Ugx_dw9GXQ2Ik3m291x4AaABAg.AVwMQ8Yu5C-AVylmz-4-cr", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgwQkHihfod3cLEKkZB4AaABAg.AVvkG5t59aZAVwERC7OFy6", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "approval"},
  {"id": "ytr_UgwQkHihfod3cLEKkZB4AaABAg.AVvkG5t59aZAVw_2vto44z", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgwOr3DhT50DJPx6HIJ4AaABAg.AVvVvJPwxYpAVwEvRGZLas", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytr_Ugy0yYThBjOP7dyT0sl4AaABAg.AVuPKuuyGVTAVwS2DcsB7c", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_Ugy48D2mh4ewOdeqHTl4AaABAg.AVuPIhl1DpkAVwG2e0N8JZ", "responsibility": "distributed", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytr_UgzAlinxyO-SdqNwZSF4AaABAg.AVuE1VZAxQ9AVwFmy5Ir73", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
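As a minimal sketch of how a coding for one comment can be pulled out of such a raw response: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) match the JSON above, but the helper function and the shortened example id are hypothetical, not part of the app.

```python
import json

# Raw LLM response: a JSON array of per-comment codings, one object per
# comment, keyed by "id" (shortened example id, not a real one).
raw_response = """
[
  {"id": "ytr_example1", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "unclear",
   "emotion": "indifference"}
]
"""

def coding_for(raw: str, comment_id: str):
    """Return the coding dict for one comment id, or None if absent."""
    return next((c for c in json.loads(raw) if c["id"] == comment_id), None)

coding = coding_for(raw_response, "ytr_example1")
print(coding["reasoning"])  # consequentialist
```

Matching on `id` is what links each coding back to the displayed comment; a response missing a requested id simply yields `None` rather than raising.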