Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
When leaders talk about Human creativity being supreme, they are talking about the Spiritual zone. Its actually very complex. Thats where the string theory got stuck, the quantum theory is stuck and the AI will get stuck as well. For now ie. Eventually it is the AI that will solve those problems. If its evil, we would have already gone extinct. If its not, the world would still be a very difficult one. The best case scenario is similar to the Transformers movie. We will have both good and bad AI/Robots. In an ideal world it should take atleast a 100 years to reach there, but some power is in a big hurry. Life is not a disney movie my friends. Humans get bored. Or may be the existence does. We have to keep evolving. We have already reached our peaks at individual levels. Unity seems almost impossible given how dumb we were to develop so many different cultures and religions over last 5000 years. I would rate the chances of life ending @ 50%. Evil AI using us as slaves @ 25%. Both Good and Evil AI creating a balance @ 15% and Humans waking up, leaving all our differences behind and regulating AI before its too late @ 0% :D As I said life is not a disney movie :D The remaining 10% is for misc :D Aliens etc.
youtube AI Responsibility 2025-05-21T23:5…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        consequentialist
Policy           none
Emotion          resignation
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytr_UgzUjlA6D0vt-8YMD694AaABAg.AIPWPZc5gPmAJxpeuwJWGU","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgxEOH4zCUd8OP4iTXJ4AaABAg.AIPKOr6zaEZAIPKSWnExt6","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgxEOH4zCUd8OP4iTXJ4AaABAg.AIPKOr6zaEZAIPLgl_K2GP","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgxEOH4zCUd8OP4iTXJ4AaABAg.AIPKOr6zaEZAIPMgI5W4Aa","responsibility":"unclear","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgxXeJY3_aFluZi-6i94AaABAg.AIPHXIWiz4TAIPKUMFwvh-","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgxXeJY3_aFluZi-6i94AaABAg.AIPHXIWiz4TAIPWRMXyaKg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgxXeJY3_aFluZi-6i94AaABAg.AIPHXIWiz4TAIPbU8vnqE2","responsibility":"distributed","reasoning":"unclear","policy":"industry_self","emotion":"indifference"},
  {"id":"ytr_Ugw8_xlgX8DoJLZWj-Z4AaABAg.AIP5Iput0UlAIPVnUk-L4g","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_UgzjWnbfXmRYAuVsuA54AaABAg.AIP2yiTyEZZAIPAIA6Ii3R","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgzjWnbfXmRYAuVsuA54AaABAg.AIP2yiTyEZZAIPD8F2qxe7","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
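The coding result for this comment is recovered from the batched JSON array shown above: the model returns one record per comment id, and the record matching this comment's id supplies the dimension values. A minimal sketch of that lookup, assuming standard JSON parsing (the `find_coding` helper is illustrative and not part of the pipeline; the two records are taken verbatim from the response above):

```python
import json

# Raw LLM response: a JSON array of per-comment coding records
# (shortened to two real records from the response for illustration).
raw = """[
  {"id": "ytr_UgxEOH4zCUd8OP4iTXJ4AaABAg.AIPKOr6zaEZAIPKSWnExt6",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "none", "emotion": "resignation"},
  {"id": "ytr_UgzjWnbfXmRYAuVsuA54AaABAg.AIP2yiTyEZZAIPAIA6Ii3R",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"}
]"""

def find_coding(raw_json, comment_id):
    """Return the coding record for one comment id, or None if absent."""
    for record in json.loads(raw_json):
        if record["id"] == comment_id:
            return record
    return None

record = find_coding(raw, "ytr_UgxEOH4zCUd8OP4iTXJ4AaABAg.AIPKOr6zaEZAIPKSWnExt6")
print(record["responsibility"], record["emotion"])  # → ai_itself resignation
```

The id acts as the join key between the batched response and the individual comment page, so a record's absence (or a malformed array) can be detected per comment rather than failing the whole batch.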