Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
@shulleon582Art is whatever people attach to it. Have you listened to technical…
ytr_Ugz9VioqC…
Cry me a river. Like you don't know it will happen. And they think the whole wor…
ytc_UgwuO0oYZ…
It only makes what it’s fed. So if people keep making a nuisance of themselves i…
ytc_UgxYJ6VuO…
so what you are saying is this ruling actually BENEFITS the studios.
they can u…
rdc_jwvah4p
Those who stop developing A.I, are doomed to be left behind by those who don't .…
ytc_UgyJvyxcy…
People crying but you paid for the ai robots to take over your jobs so you thoug…
ytc_UgyOC9IJT…
Imagine if we could use genetic editing to create an ideal body for an AI, and i…
ytc_Ugwk6wTlq…
NVIDIA CEO's sales tactics to sell his AI product.
Your sales tactics to sell yo…
ytc_UgwTADMDW…
Comment
You need to understand memory context and session management with AI. ChatGPT does not remember the conversations it has with other people; those are private. So when you pressed it about the person it recommended bromide to, it did not recall any such conversation. When AI looks for context, it searches its current conversation. In ChatGPT's case there are three methods of memory: one is permanent memory tied to your account, another is memory of the current conversation, and the third applies only if you are working in a project folder, where it has access to all prior conversations within that project folder. So no, it would definitely not have any knowledge of any guy from your conversations.
youtube
AI Harm Incident
2025-11-26T00:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgwVeD3mOcjoea7msyp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwYHeetBTqk8rfMnmR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxBOJLXGPFzuuzAocV4AaABAg","responsibility":"company","reasoning":"unclear","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxZiZjegfshvlSVrl94AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxwTr9EDoRRrWg4l2h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxiGOT6nSm-_t5-h6h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzB5mGfs_ANUaMZ0Mx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzV_aHh3b7Z8wgrrfF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyquDOqG52WvNMSeeB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgziuNbZCWfNuxIuuWp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"}
]
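The raw response above is a JSON array with one object per comment ID, coding each comment on four dimensions. A minimal sketch of parsing and validating such a response follows; note that the allowed value sets are inferred from this sample alone and are assumptions, not the full codebook.

```python
import json

# Allowed codes per dimension, inferred from the sample response above.
# ASSUMPTION: the real codebook may contain additional values.
ALLOWED = {
    "responsibility": {"user", "company", "distributed", "none"},
    "reasoning": {"virtue", "deontological", "consequentialist", "unclear"},
    "policy": {"none", "liability", "ban", "regulate"},
    "emotion": {"indifference", "fear", "outrage", "resignation"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw JSON array of coded comments; raise on schema violations."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {value!r}")
    return records

# Hypothetical single-record response for illustration.
raw = ('[{"id":"ytc_example","responsibility":"user",'
       '"reasoning":"virtue","policy":"none","emotion":"fear"}]')
codes = validate_codes(raw)
print(codes[0]["emotion"])  # fear
```

Validating each batch before storing it catches the common failure mode where the LLM invents an off-schema label or drops a dimension.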