Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The AI's neural network is just an incredibly large number of logic gates or if-then-else blocks. They are wired in a very smart way and they are a lot but in the end it's just that - a bunch of logic gates. These things do not have feelings (no matter how many there are), they just simulate them, because they were trained to do so. The more logic gates there are - the better they simulate feelings, but it's still just a simulation. That's very different from being sentient (which requires an actual soul).
YouTube · AI Moral Status · 2023-05-14T21:1… · ♥ 5
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-26T19:39:26.816318
Raw LLM Response
[
  {"id": "ytc_UgyDoFrruB1y9qrHNNx4AaABAg", "responsibility": "developer",   "reasoning": "virtue",           "policy": "none",     "emotion": "approval"},
  {"id": "ytc_Ugy5ljrW91BGqcwVlj94AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_Ugy3O4jJhfAbMmR5J254AaABAg", "responsibility": "distributed", "reasoning": "contractualist",   "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_Ugyc71LU0QhVDB8ZHMh4AaABAg", "responsibility": "ai_itself",   "reasoning": "deontological",    "policy": "ban",      "emotion": "fear"},
  {"id": "ytc_UgxoIwYWyPWMp3rfL1B4AaABAg", "responsibility": "unclear",     "reasoning": "unclear",          "policy": "unclear",  "emotion": "outrage"}
]
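A raw response like the one above can be turned back into per-comment coding records with a small parser. The sketch below is a hypothetical helper, not part of the tool itself: the `SCHEMA` of allowed values is assumed from the codes visible in this record (the actual codebook may define more), and `parse_codings` is an illustrative name.

```python
import json

# Allowed values per coding dimension (assumption: inferred from the codes
# seen in this record; the real codebook may permit additional values).
SCHEMA = {
    "responsibility": {"developer", "ai_itself", "distributed", "unclear"},
    "reasoning": {"virtue", "consequentialist", "deontological",
                  "contractualist", "unclear"},
    "policy": {"none", "regulate", "ban", "unclear"},
    "emotion": {"approval", "indifference", "mixed", "fear",
                "outrage", "unclear"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments) into a
    dict keyed by comment id, validating each dimension against SCHEMA."""
    records = json.loads(raw)
    out = {}
    for rec in records:
        cid = rec["id"]
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: invalid {dim!r} value {rec.get(dim)!r}")
        out[cid] = {dim: rec[dim] for dim in SCHEMA}
    return out

raw = ('[{"id":"ytc_Ugy5ljrW91BGqcwVlj94AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"none","emotion":"indifference"}]')
codes = parse_codings(raw)
print(codes["ytc_Ugy5ljrW91BGqcwVlj94AaABAg"]["emotion"])  # indifference
```

Validating against the schema catches the common failure mode where the model invents an unlisted code, so malformed responses fail loudly instead of silently entering the coded dataset.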