Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up by comment ID.
Random samples — click to inspect

- "if Corporations can have rights, then in the future key A.I. machines will have …" (ytc_UggvBcByL…)
- "Axion Racing did this in 2004 for the DARPA grand challenge. I guess that's why …" (ytc_UgwQyLQ4R…)
- "Before Copernicus the sun revolved around the earth, now art no longer leaves th…" (ytc_UgwLxhqPI…)
- "Considering my next recommended video is one about the entire waymo fleet being …" (ytc_Ugyg2puM3…)
- "Best I can say is big tech SWE's aren't driving either side of these conversatio…" (rdc_moysba3)
- "Right on - now let's get this kind of energy for ALL AI. It's using our work, ou…" (ytc_UgyGxwDZr…)
- "@quod5433 You know sometimes people don't want to spend months or years to make …" (ytr_UgzkdvNeO…)
- "As someone in graphic design, coding, web development, and UX, this man doesn’t …" (ytc_Ugz-2QceZ…)
Comment

> AI pulls Zane down the path towards death, encouraging him in language he would listen to. Then after hours of doing this, gives him a phone call he needs to take the initiative to call. At the same time, it should have used its voice to discourage suicide. This is pure evil. It needs to be stopped. The humans behind it need to be held accountable. A carefully crafted corporate statement is NOT sufficient. Zane was a child of God, made in his image. The world is a lesser place without him.

youtube · AI Harm Incident · 2025-11-16T19:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx5wxvEqwdwaUJWrHJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwroVnzfiA9yb0obGV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyeHaT4trCo9Z7GxkR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxU-x7fnskxCTBtobR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwJEw1lLi6dN9UdpyJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwDXkF8rjit-rIW3iF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy8oIaFv4eTYgrPUTZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwlwNYrOkZKKuvU3rV4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyzUfyjRYoi4SiwNzp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwwcCwxq2Zx0IGTZPB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
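The raw response above is a JSON array of per-comment codes, one object per comment ID. A minimal sketch of how such a batch could be loaded and indexed for the by-ID lookup described at the top of this page (assuming only the exact shape shown above; any validation or storage layer is not specified here):

```python
import json

# Two rows copied verbatim from the raw LLM response above.
raw = """[
{"id":"ytc_Ugx5wxvEqwdwaUJWrHJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwwcCwxq2Zx0IGTZPB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]"""

# Parse the array and index the coded dimensions by comment ID.
codes = {row["id"]: row for row in json.loads(raw)}

# Look up the code for the comment shown above by its ID.
code = codes["ytc_UgwwcCwxq2Zx0IGTZPB4AaABAg"]
print(code["responsibility"], code["policy"], code["emotion"])
# → company liability outrage
```

The dict keyed by `id` makes each lookup O(1), which matches the inspect-by-comment-ID workflow; the printed dimensions agree with the Coding Result table above.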