Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The biggest problem is atheism. So this is a simulation, so where did the simulators come from? From nothing? And the thing they came from, where did this come from, and that where did it come from, and, and, and. A never ending loop for a thing that must have a beginning, that's where atheism ends up in. And like the engineer said: I can't envision how a super intelligence thinks, but then when it comes to God he seems to know how the super intelligence must be and must act... that's conflicting at it's core. If there is an endless loop this can only be broken by something that is in itself timeless, no beginning, just ever existing, the perfect life, with the perfect attributes. A perfect intelligence which you can't grasp, more than you can't grasp ai super intelligence. If this perfect intelligence doesn't exist, then either things can come from absolute nothingness, or we get an endless loop, both of which are impossible. So the only that remains is the perfect intelligence which you might not understand, but he makes you understand he is existing with these perfect attributes. Some say I've studied religion and haven't found that, then my question is: did you seek out the best doctor/engineer/scientist/scholar to ask, or did you just make a small research on Google, and ask some laymen? If you really want a good diagnosis and good solid science ask the scholar, especially if it's about an important topic. That's my advice. 1:04:53
youtube AI Governance 2025-10-16T12:3…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       unclear
Policy          none
Emotion         mixed
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgyQmsLkncewgAKwrXJ4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzTi_kTJGsL2YfmUIB4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzlnObI569QxdszLRJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugxc6stdh-YCS7XKgVR4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyBqQax-HHaF0CqEBp4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugyq3ReNhkeRCRWIkT14AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzSlyEFo7aym7VkRpx4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxP3BwHMyo4eRKqtrh4AaABAg", "responsibility": "developer", "reasoning": "mixed", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyXssoZmmrJxgvgyDp4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzF_gJ-Q8ExVaCN5OB4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
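The raw model output is a JSON array of per-comment codes, so the dimensions shown for any one comment can be recovered by indexing the array by `id`. A minimal sketch, assuming only the standard-library `json` module; the raw string below is an abbreviated subset of the array above, and the variable names (`by_id`, `codes`) are illustrative, not part of the tool:

```python
import json

# Abbreviated copy of the raw model output: a JSON array of per-comment codes.
raw = '''[
  {"id": "ytc_UgyQmsLkncewgAKwrXJ4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzlnObI569QxdszLRJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"}
]'''

codes = json.loads(raw)

# Index the records by comment id so any coded comment can be looked up directly.
by_id = {rec["id"]: rec for rec in codes}

# The record for this comment carries the same values the coding-result table shows.
rec = by_id["ytc_UgzlnObI569QxdszLRJ4AaABAg"]
print(rec["responsibility"], rec["reasoning"], rec["policy"], rec["emotion"])
# → none unclear none mixed
```

Note that the record with id `ytc_UgzlnObI569QxdszLRJ4AaABAg` matches the dimension values in the coding-result table above (responsibility none, reasoning unclear, policy none, emotion mixed), which is how the table and the raw response line up.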