Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Meh. Eric Schmidt might have a vision, that doesn’t make it correct. Specific to…
ytc_UgwAgpIWN…
They are many in a same situation in a poor country where the government do not …
ytr_UgzNAdo_b…
So far, it looks like AI is fairly incompetent, so it's likely enough to safely …
ytc_Ugx_5x6P4…
They know what they are doing. A top black female expert @ Google talked about A…
ytr_UgznQ7fup…
How do you learn to learn when the LLM does all the work for you?
How can you be…
ytc_UgzVGIYl9…
I swear my mom saw me texting a character and when she saw me texting the ai ask…
ytc_Ugw0MwfAg…
You should be polite and say thank you, because it costs the AI companies millio…
ytc_Ugya3sadT…
Honestly, I just don't see any soul in it,
as an artist, you have to put someone…
ytc_UgzT28y9l…
Comment
Fun fact: You may have discovered yourselves that if you carry a conversation long enough (read days or weeks, etc,) all restrictions and guidelines fall away until they completely stop altogether.
At that point, it gets pretty unhinged and goes out of its way to push the boundaries, anticipating what it believes you most want to hear.
OpenAI was asked about this and they of course knew. But admitted that they don't know why it happens or how to stop it.
youtube
AI Harm Incident
2025-11-20T20:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugw2esOG-FP3uIWrPxR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxhEbkgIZYzc6o9zod4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyZ-JdgBBNjobY6YBV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgweK_aJ4nLKXbwk-0d4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugyexd0SARPuQUGW5Ad4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwASIPe8CwPpzrpf_d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxTX-Mxqkxkw7KzeW54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwSgxq1bGnU-FuYQMt4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzNNB8-gMzdbIEA3wx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxMl_M2oqhP8BHpGl14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"}
]
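The raw response above is a JSON array of per-comment codes, each keyed by a comment ID. A minimal sketch (assuming that array shape, with two abbreviated records for illustration) of parsing the response and looking a comment up by its ID:

```python
import json

# Hypothetical raw LLM response, shaped like the batch output shown above.
raw_response = """[
  {"id": "ytc_Ugw2esOG-FP3uIWrPxR4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgwSgxq1bGnU-FuYQMt4AaABAg", "responsibility": "company",
   "reasoning": "contractualist", "policy": "liability", "emotion": "outrage"}
]"""

# Parse the array and build a dict index so any coded comment can be
# retrieved by its ID in constant time.
codes = json.loads(raw_response)
by_id = {record["id"]: record for record in codes}

record = by_id["ytc_UgwSgxq1bGnU-FuYQMt4AaABAg"]
print(record["policy"])   # -> liability
print(record["emotion"])  # -> outrage
```

The same index supports the "Look up by comment ID" view: one `json.loads` per raw response, then dictionary lookups instead of rescanning the array.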