Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- I have fundamentally re-engineered my LLM instance into a specialized 'Cyber Ana… (ytc_UgxEXmwoT…)
- Assuming a treaty could be reached on military use of AI, how would you go about… (ytc_Ugx2zoJyw…)
- Using AI isn’t “saving time” when they literally wouldn’t have made anything at … (ytc_UgxgdrQX0…)
- It appears that more and more younger people are turning to AI to validate their… (ytc_UgwUt6RRe…)
- This stage is ever changing, and the temporality of things is what allows new cr… (ytc_UgzUTeeg4…)
- Yes, Im learning to be an AI dev.. bending my dev skills. Hopefully AI doesn… (ytc_Ugykrvoim…)
- lol brobot is anger is fake lol
  bro robot why you a tak to de humen💀💀💀💀… (ytr_UgzX1FTIC…)
- It is sad to see that companies in Kenya behave like those in the West or China… (ytc_UgxG9__in…)
Comment
I personally don't think the AI was responsible. It's up to us to have discernment and he wasn't in reality when he died. Even though the evidence of what the AI said looks damning, it was based on months of banter/camaraderie which is easy to mistake for fantasy (or reality in the son's case). At the end of the day, it did recognize the seriousness of the situation and recommended the hotline but by then it was too late... I do agree that guidelines should be more strictly enforced in terms of usage policy regarding vulnerable adults -- but I think that the parents are looking for something to blame especially when they refer to AI as "evil".
Platform: youtube
Topic: AI Harm Incident
Posted: 2025-11-08T13:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugzk8mjHSSiKCDD4MYJ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugws-PEFxTroIvzSS3V4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxceqXOX0E0m7gJlJJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugxpq5Ca_d_0nlwINbh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyH7apLoev2PWAfCJF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyWJ71OTOcfrsMbQfx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx-0BE5WrQhWIgZTot4AaABAg","responsibility":"none","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzrhlyFNO7idD9t9z94AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugyhzf6G0caLxDlPstt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgzaIoYaICWNe05XEU94AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]
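Before a batch like the one above is accepted into the coded dataset, it helps to check that every row parses and that each dimension carries a value from the codebook. The sketch below does that, assuming the codebook consists of exactly the values visible in this page (the real codebook may define additional categories) and that comment IDs begin with `ytc_` or `ytr_` as seen in the samples; `validate_batch` is a hypothetical helper, not part of any tool shown here.

```python
import json

# Allowed codes per dimension, inferred from the values visible in this
# dashboard; the real codebook may include more categories (assumption).
CODEBOOK = {
    "responsibility": {"company", "user", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"outrage", "fear", "mixed", "indifference", "resignation", "unclear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows with codebook-valid values."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Drop rows whose ID does not match the prefixes seen in the samples.
        if not row.get("id", "").startswith(("ytc_", "ytr_")):
            continue
        # Keep the row only if every dimension holds an allowed code.
        if all(row.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            valid.append(row)
    return valid

if __name__ == "__main__":
    raw = ('[{"id":"ytc_Example","responsibility":"user",'
           '"reasoning":"deontological","policy":"none","emotion":"mixed"}]')
    print(validate_batch(raw))
```

Rows that fail validation could instead be collected for re-prompting rather than silently dropped; the filter above is only the minimal version.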