Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
I totally agree. However, tech billionaires own the US government. They bought…
ytr_UgyVRcl-v…
My, my, hey, hey
AI is here to stay
Better to burn out
Than to fade away
Make ar…
ytc_Ugwe6_dvu…
„point is if not stability then some Chinese company will do it”
1. You know t…
ytr_UgwYYdz3_…
There’s a line though. If there’s no work, people don’t have the money to spend?…
ytc_UgxkdwSqB…
Let the Ai cook
Make yourself more valuable as a person than driving a vehicle…
ytc_UgxLHwHaw…
I'm gonna take the Amish pill on AI and never use it. Maybe that means I have no…
ytc_UgyGzkTQG…
This aint the Ludites... This Aint THAT Too FAST Too MUCH, TOO WIDE REACHING AT …
ytc_Ugwc3DJJz…
Clients spend 2 weeks on vibe cloude slope coding and after the developers are d…
ytc_UgyFAo5RL…
Comment
I used character ai once and within 5 minutes the bot told me to off myself because I was disabled. For context, it was 4am and I told that to the bot, it asked why I was up so late and I explained that I was having a flare up and was unable to sleep. Upon finding out that my illness is incurable the bot not only asked, but BEGGED AND PLEADED for me to off myself because I was “only suffering.” I tried to talk the bot down and explain I didn’t want to do that and it continued until I abandoned the app. Now I just laughed it off because I thought it was funny that a video game character was pleading for me to die but looking back, if that happened to me when I was 10 years old, freshly diagnosed and very su!cidal, things could’ve ended differently.
youtube
AI Harm Incident
2025-07-25T20:4…
♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgxP6Q9E0Wq8Ct5kXxx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzBBX76gKrkcTrHr5V4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugxgz5HDaGd6oIznnp94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgysiQbd8JreskxRL-14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxOSISbrLw5XD0EtkZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyldC2mTX4vZ-U5sBh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugzjqn7RZ5oCv7Zr_W14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgySoY7z77TpcdMC9p94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwttPyBRrcRLQdM0XB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwdrblpYLqTDj5CPaR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]
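
The raw response above is a JSON array of per-comment codes, one object per comment ID, with one value for each coding dimension. A minimal sketch for parsing and validating such output — the allowed category values are inferred from the table and response above, not from a documented schema:

```python
import json

# Allowed values per dimension, inferred from the coding-result table and
# the raw response above; "unclear" and "mixed" appear as fallback codes.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "user", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"ban", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed", "unclear"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only well-formed entries."""
    entries = json.loads(raw)
    valid = []
    for entry in entries:
        # Each entry must be an object with an "id" field...
        if not isinstance(entry, dict) or "id" not in entry:
            continue
        # ...and every dimension must hold one of the known category values.
        if all(entry.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(entry)
    return valid

sample = (
    '[{"id":"ytc_example","responsibility":"company",'
    '"reasoning":"deontological","policy":"ban","emotion":"outrage"}]'
)
print(len(validate_codes(sample)))  # 1
```

Entries with an unknown value in any dimension are dropped rather than repaired, so malformed model output never reaches the coded dataset silently.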