Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I could agree that our brains are just statistical engines but I do not think that anyone will ever be able to prove that. talking the dangers of AI I believe the following is where the danger lies: we are not only a body with a statistical engine running it. we are triune beings. we have the body with it's members and it's five senses which we control with our brain, we have a soul that contains our mind, will and emotions. then we have the most important part, our spirit, which contains our conscious and has the ability to commune with God, our creator, which is immortal. these three work together as one, the spirit being the most predominantly in charge. just like the three branches of government they work together to maintain balance and order to the whole. AI does not have a spirit or soul. it is only a statistical engine and therefore not govern by a conscience or the ability to commune with God. it has no will or emotions, it cannot love, or hate or feel compassion or show mercy. if you try and program these attributes into statistical engine only, you will eventually end up with disaster if you allow it to make decisions for mankind. and after saying that I will say it is too late and the disaster is on it's way. AI is in Bible prophecy, it does actually take over and rules the earth and kills more people than all the wars combined. AI is taken over by evil forces and becomes the final evil force that leads to the end of mans rule of our world when Jesus the son of God has to come back and put and end to it and all evil that inevitably pushes the button of nuclear Armageddon. I agree with you but know also what I have just said as well.
youtube 2026-04-12T12:0…
Coding Result
Dimension        Value
Responsibility   unclear
Reasoning        mixed
Policy           unclear
Emotion          fear
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgxYUGbpEtlvcRuXixl4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_UgyrPA_jkUSlN3Y2w854AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwnttcYaqhN2y8fb4l4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgzaEb2hK9rtLHBV8Qd4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugwy7-ljje1h3uhy1CZ4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugy9PT4wX7TfyJH7ZhV4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgyCgM6xtQfHoYVe87V4AaABAg", "responsibility": "unclear", "reasoning": "mixed", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxOqGe1-ypJimjB4zl4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "resignation"},
  {"id": "ytc_UgyG5DEzkaOrLmuYbdR4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxLQc7SbkhryJmnfjV4AaABAg", "responsibility": "company", "reasoning": "contractualist", "policy": "regulate", "emotion": "outrage"}
]
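The raw batch response can be cross-checked against the displayed coding result programmatically. The sketch below is a minimal, standard-library-only illustration (not the coding tool's actual implementation): it parses the raw JSON, indexes the records by comment id, and finds which record in the batch carries exactly the dimension values shown in the table above. The matching logic and variable names are assumptions for illustration.

```python
import json

# Raw LLM response for the batch, copied from the dump above.
raw_response = '''[
  {"id": "ytc_UgxYUGbpEtlvcRuXixl4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_UgyrPA_jkUSlN3Y2w854AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwnttcYaqhN2y8fb4l4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgzaEb2hK9rtLHBV8Qd4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugwy7-ljje1h3uhy1CZ4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugy9PT4wX7TfyJH7ZhV4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgyCgM6xtQfHoYVe87V4AaABAg", "responsibility": "unclear", "reasoning": "mixed", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxOqGe1-ypJimjB4zl4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "resignation"},
  {"id": "ytc_UgyG5DEzkaOrLmuYbdR4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxLQc7SbkhryJmnfjV4AaABAg", "responsibility": "company", "reasoning": "contractualist", "policy": "regulate", "emotion": "outrage"}
]'''

records = json.loads(raw_response)

# Index records by comment id for direct lookup.
by_id = {r["id"]: r for r in records}

# The coding result displayed in the table above for this comment.
target = {"responsibility": "unclear", "reasoning": "mixed",
          "policy": "unclear", "emotion": "fear"}

# Find which record(s) in the batch carry exactly those codes.
matches = [r["id"] for r in records
           if all(r[k] == v for k, v in target.items())]
print(matches)  # a single id, if the displayed codes are unique in this batch
```

In this batch the displayed combination (unclear / mixed / unclear / fear) happens to be unique, so the lookup resolves to one record; in general a batch could contain duplicates, which is why the sketch returns a list rather than a single id.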