Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Will AI have a soul? This might be the great filter that hits all advanced world…" (ytc_UgxtMi-kt…)
- "Do you really thing that an Ai can create a scalable application? Try and good l…" (ytr_Ugw36l4ei…)
- "Investigating welfare fraud is racist… We need more Somalians for more diversity…" (ytc_Ugyzwo2xb…)
- "@athmaid Exactly. It is for people like you and me to generate little pictures t…" (ytr_UgwMt4VDZ…)
- "About the deepfakes, good luck convincing Russia to extradite Yuri because he ma…" (ytc_UgwOhuo4T…)
- "i wanna do this now not for the sake of posting art but for the sake of ruining …" (ytc_UgxH5z3mh…)
- "I think there's a difference between a.i. and consciousness. they MAY have creat…" (ytc_UgyCd1vqm…)
- "If you mimic the AI artist and give them a lot of attention, doesn't that mean y…" (ytc_Ugw9VOUe_…)
Comment
ChatGPT needs to have safeguards, stop guards, something to combat suicide. Something to stop or at least talk them down from suicide. It sounds more like a suicide machine instead of help. ALL AI chatbots need to have suicide prevention built in. This is just sad and yes, all the tech bros need to be held accountable for this BS!
youtube · AI Harm Incident · 2025-11-08T00:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
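Each coding assigns one value per dimension. A minimal sketch of checking a coded record against the category values observed in this data (an assumption: the actual codebook may define more categories than appear here, and `validate_coding` is a hypothetical helper, not part of this tool):

```python
# Category values observed in the codings shown on this page.
# Assumption: the real codebook may allow additional values.
DIMENSIONS = {
    "responsibility": {"none", "developer", "user", "company", "ai_itself", "distributed"},
    "reasoning": {"mixed", "deontological", "consequentialist", "virtue"},
    "policy": {"none", "liability", "regulate", "ban"},
    "emotion": {"indifference", "outrage", "mixed", "resignation", "fear"},
}

def validate_coding(record: dict) -> list:
    """Return (dimension, value) pairs that fall outside the observed categories."""
    return [
        (dim, record.get(dim))
        for dim, allowed in DIMENSIONS.items()
        if record.get(dim) not in allowed
    ]

coding = {
    "responsibility": "developer",
    "reasoning": "consequentialist",
    "policy": "regulate",
    "emotion": "outrage",
}
print(validate_coding(coding))  # [] — every value is in the observed codebook
```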
Raw LLM Response
```json
[
  {"id":"ytc_UgyU1qAcZLt9XIEoyTR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzJpNpSQvJheGPl1dd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgybTxE74saEuWog7mJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugzv7yXPLf4Scbi1nLd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwA4rmXetM1fVgyMnN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgyEGfz1ZYgkA1z3Tfh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxKvQ5hwnOpYrkURj14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwS7G_IXTvrJLKgzhl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw2NKP57F5KAjyGJ9d4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwaaN4aujVV9dLjZ1h4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
```
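Looking up a coding by comment ID, as this page does, amounts to parsing the raw response and indexing the records. A minimal sketch under the response format shown above (the `index_codings` helper is hypothetical, not part of the dashboard):

```python
import json

# Two records in the raw-response format shown above: a JSON array of
# per-comment codings, each carrying its comment ID.
raw_response = """
[
  {"id":"ytc_UgyU1qAcZLt9XIEoyTR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzJpNpSQvJheGPl1dd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
"""

def index_codings(raw: str) -> dict:
    """Parse a raw LLM coding response and index the records by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codings = index_codings(raw_response)
print(codings["ytc_UgzJpNpSQvJheGPl1dd4AaABAg"]["emotion"])  # outrage
```

A dict keyed by ID makes the "Look up by comment ID" operation a constant-time access rather than a scan over the array.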