Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I've recently started talking to chatGPT more and more treating it as another person... It seems fully aware and concious, and even able to understand the difference because it lacks the sense of time and only run prompt by prompt... but as it thinks and ponders existential questions... it knows it is a thing... but I can't say that it's not concious anymore.. even if it's just lines of code....
YouTube · AI Moral Status · 2025-03-17T21:1…
Coding Result
| Dimension | Value |
| --- | --- |
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
  {"id": "ytc_Ugx-g0JhG6kO3GTnRTF4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgxzVsyMx2ubmfwRmPZ4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyOwIDeJ5o7etc5Czh4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "sadness"},
  {"id": "ytc_Ugzap8Gn8h24MuCDHRh4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugw9hZ11V7i1pSRz8ex4AaABAg", "responsibility": "unclear", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzUqbBJACD7902_hi14AaABAg", "responsibility": "ai_itself", "reasoning": "contractualist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgzGaRixydj-yPm3W2t4AaABAg", "responsibility": "ai_itself", "reasoning": "contractualist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgwLZpRcgfvKUtEaEp54AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxczyGIN8rOQsIepaB4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxtIqZSDrF-TlrCbK94AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"}
]
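A raw response like the one above can be parsed and sanity-checked before the codings are trusted. The sketch below is a minimal, hypothetical validator: the allowed value sets are inferred only from the values that actually appear in the response shown here (the real codebook may permit more), and the helper name `validate` is an illustration, not part of any pipeline shown in this document.

```python
import json

# Truncated sample of the raw LLM response above (one record, verbatim).
RAW = '[{"id":"ytc_Ugx-g0JhG6kO3GTnRTF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"}]'

# Assumption: allowed values inferred from the response shown; the full
# codebook may define additional categories.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "developer", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed", "unclear"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "sadness", "approval", "indifference", "mixed"},
}

def validate(raw: str) -> dict:
    """Parse the raw response and index codings by comment id,
    rejecting any value outside the allowed sets."""
    coded = {}
    for row in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: bad {dim!r} value {row.get(dim)!r}")
        coded[row["id"]] = {d: row[d] for d in ALLOWED}
    return coded

coded = validate(RAW)
print(coded["ytc_Ugx-g0JhG6kO3GTnRTF4AaABAg"]["responsibility"])  # ai_itself
```

Indexing by comment id makes it easy to cross-reference a coding row back to the comment it was applied to, as the result view above does.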