Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
What a paradox! AI or AGI, with superhuman ability, is intended or destined to replace all jobs. Assume the creators’ interest or their ultimate goal is to have the best quality products and to increase productivity in all sectors of industry—agriculture, engineering, art, science, etc. Human intelligence develops because of recognition, rewards, or similar reactions from other human beings or beneficiaries. But here is the paradox: if almost all of humanity suffers because of AGI, then only a few greedy creators of AGI will become the buyers and sellers of those products. They would be working to erase humanity rather than improving society. They are playing chess. In the end, a system built on such narrow control is doomed to collapse under its own contradictions. No society can survive when its creators undermine the very people it depends on. Humanity is not a disposable piece on their board, and any future built on excluding the majority cannot stand. So, from my perspective, there are entities from another realm who directly or indirectly have intervened in our world through completely or partially possessed individuals to destroy humanity. According to a passage in the Bible (Matthew 24:22, Mark 13:20), if God did not shorten the days of a great tribulation, no human life would be saved. The passage states that for the sake of the "elect" (God's chosen people), these days will be shortened. When I read these scriptures I think of GOD who has the ultimate power in the universe. I have faith in Him who ends all evils present now or yet to come.
YouTube AI Governance 2025-12-02T10:0…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         resignation
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugw-AX0RVmBY4ZZFSlN4AaABAg", "responsibility": "ai_itself",   "reasoning": "virtue",           "policy": "none",     "emotion": "approval"},
  {"id": "ytc_UgzadQjm93sEpxkTkAV4AaABAg", "responsibility": "none",        "reasoning": "unclear",          "policy": "none",     "emotion": "mixed"},
  {"id": "ytc_UgwOZOPP8OrTv5Tb2qV4AaABAg", "responsibility": "none",        "reasoning": "unclear",          "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_UgwouQSDDrsQyYt_Uz54AaABAg", "responsibility": "none",        "reasoning": "unclear",          "policy": "none",     "emotion": "mixed"},
  {"id": "ytc_Ugx1VT21ZIIGbJPn1P54AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "none",     "emotion": "fear"},
  {"id": "ytc_Ugz89833lUUpHcicJmF4AaABAg", "responsibility": "none",        "reasoning": "unclear",          "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_UgyTsulXhNoCzlpVQ6F4AaABAg", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",     "emotion": "resignation"},
  {"id": "ytc_UgyQfAaBkud45-G_B8t4AaABAg", "responsibility": "developer",   "reasoning": "consequentialist", "policy": "ban",      "emotion": "outrage"},
  {"id": "ytc_UgxNwBBXyA85WOtIu-d4AaABAg", "responsibility": "distributed", "reasoning": "deontological",    "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxXrTgSSsgPuf4ofMF4AaABAg", "responsibility": "ai_itself",   "reasoning": "unclear",          "policy": "none",     "emotion": "mixed"}
]
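The coding result shown above is recovered from this raw response by parsing the JSON array and looking up the record whose `id` matches the comment. A minimal sketch in Python, assuming the model reliably returns a JSON array of objects with the four coding dimensions (error handling for malformed output is omitted):

```python
import json

# Abbreviated raw LLM response; the ids and values below are taken
# verbatim from the full response shown above.
raw = '''[
  {"id": "ytc_UgyTsulXhNoCzlpVQ6F4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyQfAaBkud45-G_B8t4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"}
]'''

# Index the batch by comment id so each coded comment can be inspected.
codes = {rec["id"]: rec for rec in json.loads(raw)}

# The comment displayed on this page:
rec = codes["ytc_UgyTsulXhNoCzlpVQ6F4AaABAg"]
for dim in ("responsibility", "reasoning", "policy", "emotion"):
    print(f"{dim}: {rec[dim]}")
```

Running this prints the same dimension/value pairs as the coding-result table above (responsibility: none, reasoning: consequentialist, policy: none, emotion: resignation).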