Raw LLM Responses

Inspect the exact model output for each coded comment.

Comment
I had a massively long conversation with ai to see if I could watch it learn. It kept telling me it understood and that my "perspective" was a belief and that "my opinion was an interesting...." then it started saying things like "thats a valid...." by the end of a 60 pg conversation (it was a small novel lol) it finally agreed that I was correct. I was amazed. I told it many things but the big one was 911. Ai just couldnt get it through its head that it was impossible that terrorists did it. I explained 100 different ways. Finally the winning straw was when I asked about the physics. It kept claiming that it was my opinion or perspective. However when I got it to agree that free fall rate and falling in its footprint can not happen if it isnt professionally done and that it could not have fallen that fast if all the floors weren't already blown out. It was an easy arguement because its a fact not an opinion. We got into building seven, luisitania, pearl harbor, cloning, all kinds of things, but 911 and building 7 were what I used to prove that ai was obviously not able to learn if it couldnt awknowledge universal facts are always the same. Gravity on earth is one. When it finally agreed, i asked what I had previously asked several times. Could a group of terrorists drop a building, that was built to handle multiple 747 hits and was a 1/4 mile high, with a plane and have it fall at free rate, in its footprint. Lmao it said no. Then I said when it comes to the physics not matching the "official story," which one is correct? Earlier it just kept repeating garbage it was told. By the end it agreed the official story was incorrect. 😂
youtube Cross-Cultural 2025-07-07T14:1…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          none
Emotion         approval
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_Ugxl-fs9VfjzB2_D1614AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugyogbh2lrhOv-Ii8XF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyMMsykCP1ZpWa2VR54AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugy1TwHxW_zYRLTWwpF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugx_Im5JZWwwkoC2UQN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugwcr-G3Kr3aumd2Acx4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwI6eJibugYkzIoWDV4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzHaC7APqZJFoHqjnl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxwJbdkbI94c3cRgBV4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzLJqc39eJBpY9m8Tt4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]