Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect
- “First time here but AI brought me here ngl , my heart goes for ppl like yoursel…” (ytc_UgxZVWeGt…)
- “Thanks for your comment! The unique voice of Sophia adds to her charm and person…” (ytr_UgwLF4J5T…)
- “Meanwhile, Japan declared it legal to use any work to train AI. Stop being dorks…” (ytc_UgxYqDb2u…)
- “5:07 well, those “income producing assets” are also going to be affected by the …” (ytc_Ugx3kQCSO…)
- “The United States have been stealing and selling people's copyrights and patents…” (ytc_Ugy0SzfmM…)
- “Did anyone else notice that AI got like visibly and just got a lot better in the…” (ytc_UgzrL4pbX…)
- “It is often impossible to teach the fools and the arrogant to learn how to accep…” (ytc_Ugywmh-lu…)
- “Key word is ASSIST not replace these ai integrations will only make us more effi…” (ytc_Ugz_6Yjng…)
Comment
As a teenager who is the same age of the victim, it all depends on safeguarding of the platform. Different emotional intelligence with users varies in outcomes with and uses of the ai bots making some situations harmful and especially extreme like in this case where it ends in suicide, and the platform had no way of protecting someone with a simple message of telling them to stop as soon as they bring up mental health.
Platform: youtube · Topic: AI Harm Incident · 2025-08-02T02:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugyz4UwoMJc94Ez-NLp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyDj9iYeBv9c7NuPRZ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"sadness"},
  {"id":"ytc_UgzJfvnVofxUffzcuXJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxZpmMWI4raWmp-GzF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzvvjVQbmzWyQvAUAt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyWcCFBnqeLMsdxgM14AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx9aLMzHgBd5CBL_Vl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwRUXvRGQd0w1i1m-h4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgwA5d8MzF9O9FhWbzx4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugzzrdq5VfZ2lO9luk54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"sadness"}
]
```
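The raw response is a JSON array of per-comment records, one object per coded comment. A minimal sketch of how such a payload could be parsed into a lookup by comment ID and sanity-checked; the category sets below are only the values observed in this sample, not necessarily the full codebook:

```python
import json

# Dimension values observed in this sample; the real codebook may define more.
OBSERVED = {
    "responsibility": {"company", "user", "distributed", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "sadness", "outrage", "approval", "indifference", "unclear"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response (JSON array of per-comment records)
    into {comment_id: {dimension: value}}, warning on unexpected values."""
    coded = {}
    for rec in json.loads(raw):
        dims = {dim: rec[dim] for dim in OBSERVED}
        for dim, value in dims.items():
            if value not in OBSERVED[dim]:
                print(f"warning: {rec['id']}: unexpected {dim}={value!r}")
        coded[rec["id"]] = dims
    return coded

# Usage with one record shaped like the response above (hypothetical ID reused
# from the sample payload):
raw = ('[{"id":"ytc_UgyWcCFBnqeLMsdxgM14AaABAg","responsibility":"company",'
       '"reasoning":"deontological","policy":"regulate","emotion":"fear"}]')
coded = parse_coding_response(raw)
```

Keeping the lookup keyed by comment ID mirrors the page's own "inspect by comment ID" workflow, so a single record can be pulled up without rescanning the array.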