Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Hey Steven...Watched all of this podcast last night & greatly appreciate the thought provoking consistency to the content of this podcast. In a world of endless BS coming out of mainstream media...this is refreshing! After studying eschatology (Biblical prophecy) for the last 40 years I couldn't help but draw parallels between Roman's predictions & the 27% content of what is foretold within our Christian universal play book! This species is destined to return to a farming based living, which totally meshes in with the 99% job loss from AI. Everyone still has to eat no matter how technologically advanced we are. What is seriously scary is that when AI does take over, there are 2 major events that will unfold on this planet that will result in the reduction of the entire human population by 58%. To Super Intelligence, humans will eventually be deemed useless like Roman mentioned & treated like cattle. That also includes EVERYONE being micro chipped like someone's household pet. Refuse & the system will reject you as a contributing & worthy citizen. To me the biggest & most important factor when it comes to AI is WHO are going to be the people that program it. Are they ethical? Are they moral? We're all dead a lot sooner if they let personal ego & a lust for world dominance & control unfortunately be at the forefront. Matthew 24:22 always comes back to haunt me when I wonder how this planet will turn out.
youtube AI Governance 2025-09-13T03:0…
Coding Result
Dimension       Value
Responsibility  unclear
Reasoning       unclear
Policy          unclear
Emotion         approval
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgyCzeo-UetBvEy0c6t4AaABAg", "responsibility": "government",  "reasoning": "mixed",            "policy": "none",      "emotion": "outrage"},
  {"id": "ytc_UgxUDtZh0eLK_RaxiCx4AaABAg", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",      "emotion": "approval"},
  {"id": "ytc_UgyLkpcBr1Aq6PWCO2V4AaABAg", "responsibility": "unclear",     "reasoning": "unclear",          "policy": "unclear",   "emotion": "mixed"},
  {"id": "ytc_UgycQplRI7oEpechVjB4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugw3YVYQkCH4CLbOEo54AaABAg", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_UgxERHC6zmHBVZXsH9p4AaABAg", "responsibility": "unclear",     "reasoning": "unclear",          "policy": "unclear",   "emotion": "approval"},
  {"id": "ytc_UgxjQOZfMy5RSr0ya554AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "none",      "emotion": "resignation"},
  {"id": "ytc_UgzCJDUCc-rLM01YBhN4AaABAg", "responsibility": "ai_itself",   "reasoning": "mixed",            "policy": "none",      "emotion": "mixed"},
  {"id": "ytc_Ugy2yVrzIM7BCt5MNll4AaABAg", "responsibility": "company",     "reasoning": "deontological",    "policy": "regulate",  "emotion": "outrage"},
  {"id": "ytc_UgxTag6NulalBR7qJtx4AaABAg", "responsibility": "unclear",     "reasoning": "mixed",            "policy": "unclear",   "emotion": "mixed"}
]
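For inspecting a raw response like the one above, a small helper can parse the batch JSON and pull out the coding record for a single comment id. This is a minimal sketch, not the tool's actual code: the function name `lookup_coding` and the abbreviated two-record sample are hypothetical, while the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response shown above.

```python
import json
from typing import Optional


def lookup_coding(raw_response: str, comment_id: str) -> Optional[dict]:
    """Parse a batch LLM response and return the coding record for one comment.

    Returns None if the JSON is malformed or the id is not present, which a
    caller could map to the "unclear" values seen in the coding-result table.
    """
    try:
        records = json.loads(raw_response)
    except json.JSONDecodeError:
        return None  # model emitted malformed JSON
    for record in records:
        if record.get("id") == comment_id:
            return record
    return None


# Abbreviated sample (hypothetical two-record subset of the array above).
raw = """[
  {"id": "ytc_UgyCzeo-UetBvEy0c6t4AaABAg", "responsibility": "government",
   "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugy2yVrzIM7BCt5MNll4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]"""

print(lookup_coding(raw, "ytc_Ugy2yVrzIM7BCt5MNll4AaABAg"))
```

Looking up an id that is absent from the batch, or passing a response the model truncated mid-array, both return `None` rather than raising, so the inspection page can still render.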