Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I can totally see people falling in love with this once it has a much longer mem…" (`rdc_mfq2ce7`)
- "ai “artist” getting mad at artist is like an unemployed person getting mad at so…" (`ytc_Ugw1pp6gg…`)
- "My reasoning is : it is trained on human conversation and if it looks at similar…" (`ytc_Ugynectnf…`)
- "who could have thought that 1.2MP cameras and no radar would be a terrible idea.…" (`ytc_UgyK9IXSU…`)
- "If you put your art in the internet you put it into a dataset..... So computers …" (`ytc_Ugxupl7TJ…`)
- "Self conscious AI is a really bad thing. Even thinking about giving rights is wa…" (`ytc_UggsejZHp…`)
- "@adetheaters1111 I know your likely joking, but the AI still doesn’t have feelin…" (`ytr_UgxfuusbF…`)
- "There are two kinds of AI consumer civilian AI like OpenAI and Meta Gem, etc. An…" (`ytc_UgxFEQcON…`)
Comment
The AI says it has no prior conversation with this guy because it literally didn't. I think you're fundamentally misunderstanding how memory works for AI chatbots, they don't have memory like you and I do, they only remember details within their context window. In your case, its context window is only the conversation that YOU are having with it, it won't recall details from other conversations from other users because the version you're talking to literally doesn't have access to those conversations, there would be major privacy concerns and performance issues otherwise. Therefore, the version you're talking to will logically deny that it ever suggested anything to another user because as far as it remembers the only person it has ever talked to is you.
'I' and 'ChatGPT' are in fact two different entities. Think of it like clones of an original, a clone of the original ChatGPT told the guy about sodium bromide. The current clone you're talking to is not the same one. To the best of its ability - even when it knows it is called ChatGPT - it cannot recall ever having that conversation so the ChatGPT the news source is referring to must be different somehow.
It's frankly quite logical and reasonable for it to say these things.
youtube
AI Harm Incident
2025-12-16T02:0…
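The memory model the comment above describes (each conversation sees only its own context window, so one session cannot recall another's messages) can be sketched as two isolated sessions. `ChatSession`, `say`, and `recalls` are purely illustrative names for this sketch, not any real chatbot API:

```python
# Minimal sketch of per-conversation context windows: each session
# accumulates only its own messages, so no session can "remember"
# what was said in a different session.
class ChatSession:
    def __init__(self) -> None:
        self.context: list[str] = []  # this session's context window

    def say(self, message: str) -> None:
        self.context.append(message)

    def recalls(self, message: str) -> bool:
        return message in self.context


a = ChatSession()
b = ChatSession()
a.say("sodium bromide")             # said in session A only
print(a.recalls("sodium bromide"))  # True
print(b.recalls("sodium bromide"))  # False: B's context never saw it
```

This is the commenter's point in miniature: session B denying the conversation is the expected behavior, not a contradiction.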
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgzW0K1DUDQyaifgR854AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz6C7jSlJ2hWrlgc7d4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgwL39VOUL7pcwIv3aR4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugyqs7hI1yQPVzE3b_B4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxWzuJwxwrracLGkQF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzMlQPON4Gh8ZSqTgt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwOnF9I85WcIIH754N4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz28CvWiJdZxtb4V4J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyKkg_AZj14AJ3YYFh4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwwtdvQyEl5cCWo4Lt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
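A batch response like the one above can be sanity-checked before the codings are stored: parse the JSON array, drop any row missing one of the four coded dimensions, and tally values per dimension. A minimal sketch in Python, assuming only the field names visible in the sample (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the two inline rows are shortened dummies, not real comment IDs:

```python
import json
from collections import Counter

# Dummy model output in the same shape as the raw response above.
raw = """[
  {"id": "ytc_a", "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "indifference"},
  {"id": "ytc_b", "responsibility": "user", "reasoning": "deontological",
   "policy": "none", "emotion": "approval"}
]"""

REQUIRED = {"id", "responsibility", "reasoning", "policy", "emotion"}


def parse_codings(payload: str) -> list[dict]:
    """Parse a coding response, keeping only rows with all required fields."""
    rows = json.loads(payload)
    return [r for r in rows if REQUIRED <= r.keys()]


rows = parse_codings(raw)
emotion_counts = Counter(r["emotion"] for r in rows)
print(emotion_counts)  # Counter({'indifference': 1, 'approval': 1})
```

Counting per dimension this way also surfaces values outside the expected codebook (a typo like "aproval" shows up as its own bucket), which is a cheap check on model drift across batches.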