Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
uhh, LLMs only generate text based on previous text (as @CatroiOz mentioned) and do not answer you directly... unless the LLM gets that prompt. The highlight is that chat"bots" have template queries, for example: "You are ChatGPT, a helpful AI assistant. Answer with clarity, do not misguide the user, [...] You are given the question: [...] Answer:" Then the LLM just continues, and that's what you get. If that template didn't exist you'd probably get terrible results.

I think what this video is trying to say is that one of the ChatGPT developers might have somehow placed the "no bromide as replacement" prompt as part of the template query, because the bot was pretty defensive when asked about it. Yes, it's true, ChatGPT can't physically remember when it might have suggested AJ to take bromide 💀 but some developer might have put it explicitly in the prompt, which is why the video tried to explore that possibility.
Source: YouTube · AI Harm Incident · 2025-12-16T14:2…
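The template mechanism the commenter describes — a fixed system prompt wrapping the user's question, which the model then simply continues — can be sketched in Python. The template text and function below are hypothetical illustrations, not ChatGPT's actual system prompt:

```python
# Hypothetical sketch of a chatbot "template query": the deployed product
# wraps the user's question in a fixed preamble, and the underlying LLM
# just continues the resulting text. Template wording is illustrative only.
SYSTEM_TEMPLATE = (
    "You are a helpful AI assistant. "
    "Answer with clarity, do not misguide the user.\n"
    "You are given the question: {question}\n"
    "Answer:"
)

def build_prompt(question: str) -> str:
    """Fill the template; the model's 'answer' is a continuation of this string."""
    return SYSTEM_TEMPLATE.format(question=question)

prompt = build_prompt("Is bromide a safe replacement for table salt?")
print(prompt)
```

A developer-added guardrail like the "no bromide as replacement" instruction the commenter speculates about would simply be one more line in such a template, invisible to the end user.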
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          unclear
Emotion         indifference
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytr_Ugzqfn3eS-6eXJUJGIF4AaABAg.AQifJ2097t_AQmxtp2GW0h", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_Ugzqfn3eS-6eXJUJGIF4AaABAg.AQifJ2097t_AQnTAZ8FcTF", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_Ugzqfn3eS-6eXJUJGIF4AaABAg.AQifJ2097t_AQnz8_CJS29", "responsibility": "developer", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytr_UgwcybQWimRXODVezN54AaABAg.AQcxjFzeiZwAQd523iDzdS", "responsibility": "user", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgyZ2NVHzRASFk4RUlN4AaABAg.AQbStmvWZzcAR3C1z3NNQd", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytr_UgwA7rI55Ed5sPmWOnF4AaABAg.AQaYaliZq2oAQbN36wSKp-", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytr_Ugw76a2gkwrQzQGx3Yx4AaABAg.AQ_n3OV-fdrAQ_oUkeEBVH", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytr_Ugy9E0qfuREqGnzEjph4AaABAg.AQXWfQ9kDjwAQXXCFb9nKk", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_Ugy9E0qfuREqGnzEjph4AaABAg.AQXWfQ9kDjwAQZc9FsrM6U", "responsibility": "user", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytr_Ugx6lKylZahaNTGGx994AaABAg.AQWb493T-5DAQgSZUuSJZf", "responsibility": "user", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"}
]
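A reader reusing these records might want to parse the raw batch response and tally a coding dimension. The sketch below is a minimal illustration: the field names match the response above, but the shortened IDs and the three-record sample are stand-ins, not the full dataset:

```python
import json
from collections import Counter

# Illustrative excerpt shaped like the raw LLM response above
# (IDs shortened; real records carry full "ytr_..." comment IDs).
raw = '''[
  {"id": "ytr_example_1", "responsibility": "none", "reasoning": "unclear",
   "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_example_2", "responsibility": "developer", "reasoning": "deontological",
   "policy": "unclear", "emotion": "outrage"},
  {"id": "ytr_example_3", "responsibility": "user", "reasoning": "consequentialist",
   "policy": "industry_self", "emotion": "approval"}
]'''

records = json.loads(raw)

# Sanity-check that every record carries the four coding dimensions.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")
for r in records:
    assert all(dim in r for dim in DIMENSIONS), f"incomplete record: {r['id']}"

# Tally one dimension across the batch.
tally = Counter(r["responsibility"] for r in records)
print(dict(tally))
```

Because the model returns one JSON object per comment, a single `json.loads` plus a key check is enough to catch truncated or malformed batches before the codes are written back to the table above.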