Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
The fact that now Sams is facing lot of controversy because he's using AI its am…
ytc_UgzSVHqJX…
One question, though. Have you actually asked ChatGPT about any of this yoursel…
ytc_UgxLCjxyK…
(Facepalm)
People willingly giving the ai their art for theft... jezes... I didn…
ytc_UgyBAMODe…
The report and your video are premised on AI's out performing humans... AIs do t…
ytc_UgxM6wpuq…
😎🙌 if you talk with my chat gpt its a better than human 🙌 better than your artif…
ytc_Ugw0HhRTS…
"use Ai to create their own problems and then use Ai to test the results"🎉🎉🎉…
ytc_UgyPjjJMN…
That's something that I did find interesting after using AI art programs for a w…
ytc_UgzQv7RrN…
I'm not an artist but what pisses me off the most is when people unpload the stu…
ytc_Ugy7foojd…
Comment
Indeed the dose makes the poison.
You can die from too much water. You can die from too much oxygen. But both are essential to life as we know it. So consuming them in normal amounts is the right thing to do.
Sure, too much chloride is bad for you. But so is too much bromide. And the level for bromide is much, much lower. There is a reason table salt is sodium chloride and not sodium bromide or sodium fluoride.
A similar situation happens with calcium and strontium. They look very similar to the body, but a big part of strontium one might encounter these days is highly radioactive Strontium-90.
But yes, the reliability of AI chatbots is questionable. I noticed that when one suggested to me that a 120 Hz hum in my room was a result of the mains frequency. It even correctly noted that the grid in my location runs at 50 Hz. Therefore the chatbot said that 120 is twice of 50.
If they can't even do basic maths, they surely aren't capable of giving medical advice.
youtube
AI Harm Incident
2025-11-26T04:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugy0DmCmdbe-BfbGEIF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz_fZVUyWAQjnww2Rt4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw0LvYWn3euTNMqphp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxfVhMuEfzwzpNhu4Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugyd0ZknLHqNGkF5Fuh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzn2mILCNXju5-IEwN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwgzyQfe96fAlDNXop4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgwG9HnHQona2FTQMNZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxqhIXJky1u-Jy6crJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxkCyIppitIVtTIqnB4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"amusement"}
]
```
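A raw response in this shape can be parsed into per-comment records before loading it into the coding table. The sketch below is a minimal, assumed approach (not the tool's actual pipeline): it takes the field names straight from the JSON above and simply drops any record missing one of the five dimensions.

```python
import json

# The five fields each coded record carries, taken from the
# raw LLM response above; treating them as required is an assumption.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_coded_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Keep a record only if it is an object with every required field.
        if isinstance(rec, dict) and REQUIRED_FIELDS <= rec.keys():
            valid.append(rec)
    return valid
```

For example, feeding it a two-element array where the second record is missing most fields returns only the first record.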