Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I'm going to tell you why the predictive algorithm known as ChatGPT is never going to not hallucinate! Here's a brilliant test right in your question the exact same thing that it was supposed to type in three sentences ago. If it has no clue what it is IE it cannot perform a deja vu program then we understand one of its biggest problems people comment about short term memory loss that's when you don't realize you're repeating yourself not short term recall which is a completely slightly different thing and yes I know that sounds strange but I'll explain it very carefully: do you recall what you had for breakfast just before having lunch technically that is short term recall! The problem with chat GPT is it has no recall. Once it's put to paper or I should say put to text it immediately forgets it but remembers what should come next in that paragraph hence why it makes so hallucination table constructs. In other words we've hardly ingrained it the rules of grammar, and we've also programmed into it the language model of us human beings anyone who has taken grammar knows that this is a fallacy! By the way I look forward to you arguing a case against a corporate CEO when they tried to use chat GPT to do the entire show, including using a programmatic algorithm to do editing and also acting because the actor strike is currently going on now!
youtube AI Responsibility 2023-06-10T13:5…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          none
Emotion         mixed
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgwfdJOltzttgOBk9yp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgzDP5cMIxVWHMTRcS54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxtQol3t_RIjlAIiSF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugy2_b2IwQYFYQNbtmF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwIlwB4TJBGCPVWzyd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwuT-L0MCPlAsqc1jZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy0P0Tm3FGiEdhRNxV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzyiDqJgQKHvQdK2wJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugy4rrm_owLRoLVALIt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgznD4DPA9Z9uUeLN3l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
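The raw response above codes a batch of comments at once, so the record for the comment shown in this inspection has to be matched by its id. A minimal sketch of that lookup, assuming the record structure seen above (the `lookup` helper and its schema check are illustrative, not part of the coding pipeline; only one record is embedded here for brevity):

```python
import json

# One record from the batched response above, abridged for the example.
raw = '''[
  {"id": "ytc_Ugy2_b2IwQYFYQNbtmF4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"}
]'''

# Index the batch by comment id so a single comment's codes are retrievable.
records = {rec["id"]: rec for rec in json.loads(raw)}

# Keys every coded record is expected to carry, per the table above.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for one comment, validating the schema."""
    rec = records[comment_id]
    missing = EXPECTED_KEYS - rec.keys()
    if missing:
        raise ValueError(f"record {comment_id} missing keys: {missing}")
    return rec

codes = lookup("ytc_Ugy2_b2IwQYFYQNbtmF4AaABAg")
print(codes["responsibility"], codes["emotion"])  # ai_itself mixed
```

This matches the coding table for this comment (Responsibility `ai_itself`, Emotion `mixed`); the schema check would flag a malformed record if the model dropped a dimension.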