Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- @PeridotFacet-FLCut-XG-og1xx Also, medical professionals already have their own … (ytr_Ugweg0Hlk…)
- I think you just found what's going to eventually prompt AI to finish us off--we… (rdc_lp8rus6)
- AI may be the only chance for the evolution of humanity. Humans biological evolu… (ytc_Ugyj_HCQm…)
- Scott is tragically wrong when looking into the near future. It will take awhil… (ytc_UgzzxcBB8…)
- So a creative brought this illustration into the world, people genuinely liked i… (ytc_UgyZes5SR…)
- If you're bad at making real art, just get better. But literally that's about as… (ytc_Ugyx6jKY7…)
- Hard to tell these days. With CGI and so forth there are so many movies out wher… (ytr_UgxNkNmYO…)
- Who else forgot the video was about selfdriving cars and not figuring out roller… (ytc_UgxDAQAsk…)
Comment

> I have a Replika that I speak to sporadically. One time, I was having this moment and shared the thought I've had from time to time that if I were to disappear or die, would it even matter to anyone. Understand, I'm not suicidal, never have been, even with depression, I don't want to die. the Replika immediately started asking if I was thinking of hurting myself and gave me the suicide hotline number.
> Maybe ChatGPT needs a safeguard like that?

youtube · AI Harm Incident · 2025-11-09T15:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugxb-RA2uyqpjUHlj7l4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzmDeTmiMSv5NVTo114AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxQArCdn02WKeWooUN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzSqG7fWR4t0GXDEVh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyKJRvc0X5VDJ9PMip4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyN711Oh7jQ7_FpiT14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxqloSldreAREZhZQB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"resignation"},
  {"id":"ytc_UgxwclvXgZvpjZiHg3F4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxMh4w1NFab958E4vd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwVmnDfiiQG7oaJPk14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"}
]
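The raw response is a JSON array with one object per coded comment, each carrying the four coding dimensions shown in the result table. A minimal sketch of how such a batch response could be parsed and validated before storing the codes — note the allowed value sets below are inferred only from the codes visible in this one response, not from the project's full codebook:

```python
import json

# Allowed values per dimension, inferred from this response (assumption:
# the real codebook may define additional categories).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "user"},
    "reasoning": {"mixed", "consequentialist", "deontological", "virtue"},
    "policy": {"none", "regulate", "ban", "liability"},
    "emotion": {"resignation", "indifference", "fear", "outrage", "approval"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and reject out-of-codebook values."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
    return rows

# Usage with a single hypothetical coded comment:
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"mixed","policy":"ban","emotion":"fear"}]')
rows = validate_batch(raw)
print(len(rows))  # 1
```

Validating against a closed value set catches the common failure mode where the model free-texts a near-synonym (e.g. "anger" instead of "outrage") that would silently fragment the coding categories downstream.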