Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I feel like the AI isnt at fault here, because the patient literally rigged the results to have the confirmation bias he was looking for. He WANTED to starve his body of the chemical it needed, it was always going to end up in the hospital.
youtube · AI Harm Incident · 2025-11-25T02:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugwb2y1ZHK3RiFS-4jx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgznJ_7zXkpthK-JWQ54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxIuvaHFCYH-O0bXtp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyaVZS44NOYSlBfu5p4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyK0_bZ8qOBrR3H-HB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxO3LSrJlkOWZHcN-Z4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwR68zZjgweFDw_tVd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyjR_qcClpYhNKYY_Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyUouiBNq1kw1bx-7p4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"disapproval"},
  {"id":"ytc_UgxHglo5rqCEW7wXSQB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
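A raw response like the one above must be parsed and checked before the codings are stored. The sketch below is a minimal, hypothetical validator: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response shown here, but the allowed value sets are only those observed in this sample — the full codebook may define more, so treat them as illustrative.

```python
import json

# Dimension values observed in the sample response above; the actual
# codebook may permit additional values (these sets are assumptions).
ALLOWED = {
    "responsibility": {"user", "developer", "company", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"indifference", "approval", "disapproval", "fear", "outrage"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed coding entries."""
    entries = json.loads(raw)
    valid = []
    for entry in entries:
        # Every entry needs a comment ID plus a recognised value
        # for each coding dimension; anything else is dropped.
        if not isinstance(entry, dict) or "id" not in entry:
            continue
        if all(entry.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(entry)
    return valid

# Hypothetical usage with one well-formed entry:
raw = ('[{"id":"ytc_example","responsibility":"user",'
       '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]')
print(len(validate_coding(raw)))  # 1
```

Filtering rather than raising keeps a single malformed entry from discarding the rest of an otherwise usable batch; rejected IDs could then be re-queued for a retry pass.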