Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The AI problem is fabulously discussed! Truly Biblical. Ask AI how it will become the "nervous system" of the Biblical apocalyptic "Beast" aka: system that precipitates the "end" of the world. Its logical analysis is that it IS inevitable, not even just probable. I'm inclined to suspect that this entire podcast IS simulated and both these "guys" are AI generated avatars. So he might be right, more so than he thinks, if he's a simulation inside a world that may or may not be a simulation. The official name of the "simulation " is "The Plan of Salvation". And the super intelligence is God, who is not an algorithm. However, I don't suggest that there isn't a thread of truth in this ideology of a "simulation". I see the possibility that it isn't "only science is true and correct, or only religion is true and correct", rather science and religion are two levels of truth of the universe which is yet to be fully understood by mankind. That's where faith comes in. If the ideology here is wrong, and God IS a real and tangible being, not an algorithm, how unfortunate for those believers that they will have forfeited the reward/outcome of reaching their divine potential. However, if God is nothing but an algorithm, and we believe He's an all knowing physical being, we lose nothing of value by living in faith and striving to become the highest quality moral and ethical and loving beings possible. There's no argument that can outweigh the risk/reward potential of living in faith rather than not.
youtube AI Governance 2025-10-23T06:2…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        consequentialist
Policy           none
Emotion          fear
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[{"id":"ytc_UgwddohxPbnEPZfdpH14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgwTXJI9A6YZUwZKzmN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgyistcfkqoNgdqGjqx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgxwQTzcmQb5CL7c1yp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgxNKmddlgngbOGPSht4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
 {"id":"ytc_UgwMzEpqry-vARN7DLx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_Ugwn8hVBvT2uL5Y_8tx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
 {"id":"ytc_Ugy8hV2MtwCSWMb4HYt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgzkDvRwJprr79SIaVF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgxXyQ28KpxI5SVePN94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"}]
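The raw response is a JSON array of per-comment codes, one object per comment id. A minimal Python sketch of how such a response can be parsed and a single comment's coded dimensions looked up (the field names and the example id come from the response shown above; the `raw` string is truncated to one entry for illustration):

```python
import json

# Raw model output: a JSON array of per-comment code objects.
# Only the entry for the comment shown in the table above is included here.
raw = ('[{"id":"ytc_UgwMzEpqry-vARN7DLx4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"none","emotion":"fear"}]')

codes = json.loads(raw)

# Index the codes by comment id so any coded comment can be inspected directly.
by_id = {entry["id"]: entry for entry in codes}

entry = by_id["ytc_UgwMzEpqry-vARN7DLx4AaABAg"]
for dim in ("responsibility", "reasoning", "policy", "emotion"):
    print(f"{dim}: {entry[dim]}")
```

This reproduces the Dimension/Value table above for the selected comment; the same lookup works for any of the ten ids in the full response.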