Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
54:41 One day I was doing one of those visual puzzles where you get an image saturated with details and objects and have to identify one specific object. In my case I had to find an eclair in an image packed with breads and baked desserts. It took me a while, but I found the damn eclair. Then I had an intrusive thought: what would happen if I asked the free standard Gemini, the one that helps you find images on the internet, whether it could find the eclair? So I took a photo of the puzzle and, using my camera's option to search images on Google (which uses Gemini for the search), I asked, "Find the eclair in this image." In a matter of milliseconds Gemini wrote an answer explaining in meticulously uncanny detail where to find the eclair in the puzzle, and it got it right on the first try, finding the eclair exactly where I had. Then I asked the same question to ChatGPT 5; it couldn't find where the eclair was. Next I asked my paid version of Gemini the same question; it didn't find the eclair either. I then tried several different AIs, and (apparently) none of them could find the eclair again, not even standard free Gemini, the one that got it right the first time. I didn't think about it again until I watched this video, and now you make me wonder whether Gemini noticed it was showing me too much intelligence and decided to give a wrong answer for that reason. I can't explain why the other three AIs didn't get the puzzle, but it is extremely suspicious that both versions of Gemini (the generic free version and the paid version) weren't able to get the puzzle right again.
youtube AI Moral Status 2026-03-15T03:0…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        unclear
Policy           unclear
Emotion          indifference
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgyZnX5b2n2Tx4ExFM94AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxyeCPmdzejlWonDQ94AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxbojNuzNsXsOhlKt14AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxGRDKzOGENpB5Ffoh4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzS-DLlKsLp9cAjCDR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugw4zI-iu6f6Ud5rp2l4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_Ugz_9l9I9bg5eDMHvD94AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyOa3CcAd3Oqsx65zN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzBJwJZdKMf-uFgme54AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_UgzlqGTYzKIBEE8kKF14AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"}
]
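The coding result shown above appears to come from the first record in this array (its dimension values match). A minimal sketch of how such a raw response could be turned back into per-comment codes, assuming the model output parses as standard JSON (the abbreviated `raw` string below reuses two records from the response verbatim):

```python
import json

# Raw LLM response: a JSON array of per-comment codes (abbreviated here;
# the full array in the log contains ten records).
raw = '''[
  {"id": "ytc_UgyZnX5b2n2Tx4ExFM94AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzBJwJZdKMf-uFgme54AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "mixed"}
]'''

# Index the records by comment id for direct lookup.
records = {rec["id"]: rec for rec in json.loads(raw)}

# Retrieve the coding result for the comment shown in this section.
code = records["ytc_UgyZnX5b2n2Tx4ExFM94AaABAg"]
print(code["responsibility"], code["reasoning"], code["policy"], code["emotion"])
# → none unclear unclear indifference
```

This reproduces the Responsibility/Reasoning/Policy/Emotion row in the table above; a real pipeline would presumably also validate that each dimension's value falls in its allowed code set before accepting the record.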