Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Fun Fact, you can actually get a GPT of it called AI Hummanizer, its the number …" (ytc_UgzVKpjGH…)
- "Are we sure this video was not created by AI to see our reaction to the idea of …" (ytc_UgiACXM3r…)
- "Open AI is a joke… it’s like playing with sticks and stones while others have su…" (ytc_Ugyph7aot…)
- "When we LIKE talk about LIKE an AI LIKE goal. Bloody hell Alex you sound like a …" (ytc_UgxeBw_Zc…)
- "Love that they used a song from Detroit: Become Human 😂 This stuff gets crazier …" (ytc_Ugzqeet9o…)
- ""The current academic 'war' against AI is not a defense of student learning, but…" (ytc_UgzEYA4f_…)
- "I think AI has the potential to tip the scales towards democratic socialist coun…" (ytc_UgxNJqGUH…)
- "Its funny that the people who are vehemently against robotics and AI here are al…" (ytc_UgyIV42g6…)
Comment

> Here's what I want to happen from this. I want Sam Altman and the board and everyone who helped design the thing, they are all responsible for that guy's death. So they all need to be charged with accessory to murder at the very least. Just like that one girl who egged on her boyfriend and got him to commit suicide, chat GPT and open AI essentially did that to that man. So those who created it, they are ultimately responsible for it. They need to be charged.

youtube · AI Harm Incident · 2025-11-08T01:2… · ♥ 40
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgwHWmKVArrbBzNSDjR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwFlJ4ZAsJf5spd9il4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzdECtLb4JgAsb4IGx4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwnsHj2UryVRe1jTNp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugys0TIGpgjHPPXCit14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxB_JDqFtoY8ForzF54AaABAg","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxMbvndbrGSxaWtl5d4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxRyB2HyjZMmt_-XXl4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwRMMi4xtxCxLq4pIZ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzDi5uz-uMjn3iE8fF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
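A raw response like the one above can be parsed into a per-comment lookup table, which is what the "Look up by comment ID" view needs. The sketch below is a minimal, hypothetical implementation: the allowed value sets per dimension are inferred only from the codes visible in this sample, so the real codebook may contain more categories, and the `ytc_example` ID is a placeholder, not a real comment ID.

```python
import json

# Allowed values per coding dimension, inferred from the sample output
# shown above (assumption: the real codebook may include more categories).
SCHEMA = {
    "responsibility": {"user", "company", "developer", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "liability", "regulate", "ban", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "indifference"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of coded comments) into a
    dict keyed by comment ID, rejecting any code outside the schema."""
    by_id = {}
    for rec in json.loads(raw):
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim} value {rec.get(dim)!r}")
        by_id[rec["id"]] = rec
    return by_id

# Hypothetical single-record response for illustration.
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"deontological","policy":"liability","emotion":"outrage"}]')
codes = parse_batch(raw)
print(codes["ytc_example"]["policy"])  # liability
```

Validating against an explicit schema at parse time catches the common failure mode of LLM coders inventing off-codebook labels, instead of letting them silently enter the dataset.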