Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Obviously. If billions of money has been invested in AI, there must be a reason behind that. What it is. Is it humans have become obsolete. Yes. But why. If all universities in the world teach the same thing, then humans will be obsolete. Have we seen the speed at which AI answers. No humans can do that. But let me tell a trick where we can bypass AI. AI takes info from web, does deepseek, deepthink and so on. Large amount of data. Ok good. But remember we humans can always create new things out of no data without internet without copies, without any data. This is where AI fails. The power of creativity of humans comes from nothing to something. We analyse all and even analyse AI. But one thing jobs which require data analysis from available data will face rude times with AI. Humans should find new codes, not Da Vinci codes but human codes and tricks which we don't program nor write to trick AI at that point. Otherwise there will be massive unemployment ahead. Learn AI but learn to trick it as well. Create what it cannot. If it has worldwide data on internet and database, trick it by thinking and new lines of thought. All the best to humans vs AI and the conspirators behind AI deepfakes and all. Remember nothing no one can colonise humans. Every human brain is an act of God as are every animal's brain. AI and AI conspirators we welcome you to new human brains and minds. AI will fail and be under our feet. Dare to fight it and the answer will come. My love to all but not AI and AI conspirators. I do use AI heavily but that does not mean I don't know what's happening and where it wants to head us. AI we use you but we will not let AI use us. Jai Hind. Jai Bharat. Jai World.
youtube AI Jobs 2025-05-31T00:0…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       mixed
Policy          none
Emotion         mixed

Coded at: 2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgxvYqgfDiMxUpJmgVZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwfEkigu6Zh9kkUtzh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxgTgLP8itSSVZUgkJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyJUdz1NJF43b6jtaF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx0nNcZOjifql17eeR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxWO1cbg2biw15QHPR4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyvD8UPRN7sY4XPe_d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugx9k6II591u9IBS2EV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwTEUjANQLW5Lmetvt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgygFukZmIdEcmzQQRV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
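A response in this shape can be machine-checked before the codes are stored. The sketch below parses a raw response and rejects any record whose value falls outside the codebook; note that the allowed-value sets are an assumption inferred only from the entries shown above (the full codebook may define more values), and the sample records and their ids are hypothetical.

```python
import json
from collections import Counter

# Allowed values per coding dimension. ASSUMPTION: inferred from the
# records visible in this response; the real codebook may be larger.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"approval", "indifference", "mixed", "outrage",
                "fear", "resignation"},
}

def validate_codes(raw: str) -> Counter:
    """Parse a raw LLM response (a JSON array of coded comments),
    raise on any out-of-codebook value, and tally the emotion codes."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec[dim]!r}")
    return Counter(rec["emotion"] for rec in records)

# Two hypothetical records in the same shape as the response above.
sample = json.dumps([
    {"id": "ytc_a", "responsibility": "none", "reasoning": "unclear",
     "policy": "none", "emotion": "approval"},
    {"id": "ytc_b", "responsibility": "ai_itself",
     "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
])
print(validate_codes(sample))  # Counter({'approval': 1, 'fear': 1})
```

Validating at parse time keeps a single malformed or hallucinated code value from silently contaminating the coded dataset.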