Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "How can you say only greedy nerds? Every business is looking to remain competiti…" (rdc_nnrvswf)
- "Maybe this is a dumb question, but couldn't a country pass a law that a business…" (ytc_Ugw2xVnr2…)
- "AI will definitely be the end of humanity as we know it. I’m genuinely frightene…" (ytc_UgwFAm3YJ…)
- "The documentary provides a thorough exploration of the potential and risks of ar…" (ytc_UgwUqiMCl…)
- "To me the AI is just a big search machine that can reply you questions base on w…" (ytr_Ugwyq0h2l…)
- "Is this a video with real Optimus units and a real Cybertruck or is it an AI gen…" (ytc_UgxGImbEB…)
- "I personally love both art and ai. Some jobs are gonna be lost and ai art can cr…" (ytc_Ugz475tEb…)
- "Bumbling Ben Hattey spends too much on toys. Artificial Intelligence is a misno…" (ytc_Ugzz3WzXT…)
Comment
uhh LLMs only generate text based on previous text (as @CatroiOz mentioned) and does not answer you directly...unless the LLM gets that prompt
The highlight is that chat"bots" have template queries, for example:
"You are ChatGPT, a helpful AI assistant. Answer with clarity, do not misguide the user, [...]
You are given the question: [...]
Answer: "
Then the LLM just continues, and that's what you get
if that template didn't exist you'd probably get terrible results
I think what this video is trying to say is that one of the ChatGPT developers might have somehow placed the "no bromide as replacement" prompt as part of the template query because the bot was pretty defensive when asked about it. Yes, it's true, ChatGPT can't physically remember when it might have suggested AJ to take bromide 💀 but some developer might have put it explicitly as the prompt, so as why the video tried to explore that possibility.
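The "template query" mechanism the comment describes can be sketched in a few lines. This is a minimal illustration, not ChatGPT's actual system prompt: the template wording and the example question below are hypothetical.

```python
# Sketch of a chatbot "template query": the interface wraps the user's
# question in a fixed template, and the LLM simply continues the text.
# Everything generated after "Answer: " is what the user sees as the reply.
def build_prompt(question: str) -> str:
    template = (
        "You are a helpful AI assistant. Answer with clarity, "
        "do not misguide the user.\n"
        "You are given the question: {question}\n"
        "Answer: "
    )
    return template.format(question=question)

prompt = build_prompt("Is bromide a safe replacement for chloride?")
# The model receives `prompt` and generates a continuation from it.
```

Without such a template, the bare model would continue the user's text in an arbitrary direction, which is why the comment notes the results would likely be poor.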
youtube · AI Harm Incident · 2025-12-16T14:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytr_Ugzqfn3eS-6eXJUJGIF4AaABAg.AQifJ2097t_AQmxtp2GW0h","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugzqfn3eS-6eXJUJGIF4AaABAg.AQifJ2097t_AQnTAZ8FcTF","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugzqfn3eS-6eXJUJGIF4AaABAg.AQifJ2097t_AQnz8_CJS29","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytr_UgwcybQWimRXODVezN54AaABAg.AQcxjFzeiZwAQd523iDzdS","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgyZ2NVHzRASFk4RUlN4AaABAg.AQbStmvWZzcAR3C1z3NNQd","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgwA7rI55Ed5sPmWOnF4AaABAg.AQaYaliZq2oAQbN36wSKp-","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytr_Ugw76a2gkwrQzQGx3Yx4AaABAg.AQ_n3OV-fdrAQ_oUkeEBVH","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytr_Ugy9E0qfuREqGnzEjph4AaABAg.AQXWfQ9kDjwAQXXCFb9nKk","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugy9E0qfuREqGnzEjph4AaABAg.AQXWfQ9kDjwAQZc9FsrM6U","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugx6lKylZahaNTGGx994AaABAg.AQWb493T-5DAQgSZUuSJZf","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}
]
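A raw response like the one above (a JSON array of coding objects, one per comment ID) can be parsed back into per-comment rows along the four dimensions shown in the "Coding Result" table. A minimal sketch, using a shortened two-item excerpt with hypothetical IDs for illustration:

```python
import json

# Parse a raw LLM coding response: a JSON array where each object
# carries one coded comment, keyed by its comment ID, with the four
# coding dimensions as string fields.
raw = '''[
  {"id": "ytr_example1", "responsibility": "none", "reasoning": "unclear",
   "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_example2", "responsibility": "developer", "reasoning": "deontological",
   "policy": "unclear", "emotion": "outrage"}
]'''

rows = json.loads(raw)
# Index the rows by comment ID so a single comment's coding can be
# looked up directly, as the inspector page does.
by_id = {row["id"]: row for row in rows}
```

Because the model returns plain text, a production pipeline would also need to handle responses that fail `json.loads` (truncated or malformed output) rather than assume the array is always well formed.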