Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Alright so, I agree with your thesis of "Ai bad" in totality, small mistakes are…
ytc_Ugw6pzNT6…
there could be a standard in place however if there is a licence involved it wil…
ytc_UgyOyvhgy…
I'd strangle anyone who said my years of training and style developing culminate…
ytr_Ugw91CTL1…
My opinion telegram needs shut down and to cooperate with the government. Deep f…
ytc_UgxEWGKPX…
Dude there was a particular art style some prompter uses on a chat site and I as…
ytc_UgxBaTVFW…
Ai was made for talentless, unappreciative dickwads to shit all over peoples har…
ytc_Ugx-zF6U5…
Here's what Grok answered about my real life example "In short, per Penrose’s vi…
ytr_UgxDGLuIn…
Often, the CEO‘s of all the big tech companies, that are involved, in this whole…
ytc_Ugxf3-_pQ…
Comment
The fact that his child feel in love with an AI shows how desperate he was to finding emotional love, clearly there more to the story than a boy simply falling in love with an AI. Children aren’t stupid, they can pick up thing very quickly. It has me wondering what happened in the mother’s household instead for a child to run to an ai to feel comfort… when ai bots are created, laws and regulations states that no ai should “encourage” self harm or suicidal ideation.
youtube
AI Harm Incident
2026-04-20T22:4…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgzZatF-RrpTVxnLsDV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzkDQc0-5qeapM9j-N4AaABAg","responsibility":"user","reasoning":"virtue","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugx-gMCqCXAkaOf065p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgySp9XM33b6fyWM9a94AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgxSrA1M6cCqQQWSlQp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugzqy2aQVWGCxKafseZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxZkWYv67a-4JDg5814AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugwd864jA5nI59VgrsN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgytUCxFjmHqWpOLjcZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx2uh7nSqRP_tm8EsF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]
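The raw response above is a JSON array in which each object carries a comment ID plus the four coding dimensions shown in the result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and sanity-checked, assuming the allowed code sets are exactly the values that appear in the sample output (they are inferred here, not taken from a documented schema):

```python
import json

# Allowed values per dimension, inferred from the sample raw response above.
ALLOWED = {
    "responsibility": {"none", "user", "company", "ai_itself", "distributed", "unclear"},
    "reasoning": {"unclear", "virtue", "consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate", "ban", "liability", "unclear"},
    "emotion": {"indifference", "mixed", "fear", "resignation", "outrage", "approval"},
}

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response and return {comment_id: {dimension: value}}."""
    items = json.loads(raw)
    coded = {}
    for item in items:
        for dim, allowed in ALLOWED.items():
            if item.get(dim) not in allowed:
                raise ValueError(f"{item.get('id')}: unexpected {dim} value {item.get(dim)!r}")
        coded[item["id"]] = {dim: item[dim] for dim in ALLOWED}
    return coded

# One entry from the raw response shown above.
sample = '''[
  {"id":"ytc_UgzkDQc0-5qeapM9j-N4AaABAg","responsibility":"user",
   "reasoning":"virtue","policy":"regulate","emotion":"mixed"}
]'''
coded = index_codings(sample)
print(coded["ytc_UgzkDQc0-5qeapM9j-N4AaABAg"]["policy"])  # regulate
```

Indexing by comment ID matches the tool's "Look up by comment ID" workflow; rejecting out-of-set values guards against the model drifting from the coding scheme.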