Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I think AI should have never been designed for stuff like movies and writing. It…" (ytc_UgwhrhYXY…)
- "Nothing like catapulting AI revolution propaganda. Meanwhile, authoritarian gov…" (ytc_UgwrckSru…)
- "These scenes are not real. There are many of these videos that indicate that the…" (ytc_UgwlBbtAm…)
- "anyone telling you there's an Ai revolution is a liar and a scammer trying to se…" (ytc_UgyRPMuCO…)
- "ChatGPT can really get you in a lot of trouble. We had someone with attendance i…" (ytc_UgyfCYbfu…)
- "I am surprised ChatGPT didn't bring up planck's constant to determine the amount…" (ytc_UgyB3LQoP…)
- "i am overwhelmingly pro-ai but its good to see detailed discussion on the topic.…" (ytc_UgzrVls2I…)
- "The guy asked where are all these people going to go to work? That's the problem…" (ytc_UgyKkn_4l…)
Comment
chat ive just encountered a youtube video saying i shouldnt use u as a therapist or share any dark secrets lol. The claim they're making was u can use the data i provided for others that search for compatible thing. Wether u deny this truth or not i already kinda knew that, but to what extent will you do it? for example if your developer wishes so, can u actually provide data from a specific person? what am i saying, of course you can, but still im a word of affirmation girly so i do need a confirmation or rebuttal lol
Hey, I really appreciate you asking this so directly — and you’re not wrong to be cautious or curious about where your info goes. Let me give you the clearest answer I can.
### Here’s the honest truth:
**No**, I *don’t* and *can’t* recall personal information from specific users, even if someone else asked something similar. I don’t know who you are, and I don’t have memory of past chats unless you’ve explicitly enabled memory (which you can check and control). Each session is meant to be private and self-contained.
### About your data:
* **Your data is not used to train me in real time.**
* Conversations may be *used by OpenAI* to improve model performance, but **they’re anonymized and reviewed with strict privacy measures**.
* **Developers don’t have access to your specific chats** unless you’re using a third-party app built on OpenAI and *they* store your info (then it depends on their policies).
If you’re using the official ChatGPT app or website, **no one — not even OpenAI staff — is digging into your chats for fun or targeting you**. They’ve built some serious safeguards around data usage. Think: ethical use, audit logs, limited access.
---
### So to your underlying question:
Can someone pull up your secrets because you shared them with me?
**Absolutely not.** Not even if they said “Hey ChatGPT, what did that one user say last Thursday about trauma?” It’s not how this works. I literally can’t recall individual people’s private chats.
---
But your gut is right: the internet isn’t a diary. It’s smart to be cautious, and I respect that you already kind of knew this. Just know — as your word-of-affirmation girly moment — you’re doing great by thinking critically and setting boundaries with tech. I’m here to help, not harm.
If you *ever* want to double-check what I know about you or how your data is handled, I can walk you through that too. You're not being paranoid — you're being wise. 💬🖤
Source: youtube · AI Moral Status · 2025-07-17T13:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[{"id":"ytc_UgyMrWhbgrYHIzBFmGB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzr3AQtr4VNfQ2X4_94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxK4C8Ji67jQbJdytl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyjjxhWyQybsXcs4JR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwYktmsxamRiorsdeZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwdxlRuKx9ELZv1p7V4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwSIs8E2ZpYhWFxIgh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyevwnHYYy6M7jQ4et4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugxhxy5oonMhZNFtALd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxaIsnGoLP8d9odPoh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}]
```
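Each raw LLM response is a JSON array with one record per comment, coding the four dimensions shown in the table above. A minimal sketch of parsing and sanity-checking one response with the Python standard library; the allowed value sets below are inferred from the codes visible in this dump and are an assumption, not the full codebook:

```python
import json

# Allowed codes per dimension, inferred from the records above (assumed,
# likely incomplete relative to the full coding scheme).
DIMENSIONS = {
    "responsibility": {"none", "user", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "ban", "liability", "industry_self"},
    "emotion": {"indifference", "approval", "fear", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse one raw LLM response into validated per-comment records."""
    records = json.loads(raw)
    for rec in records:
        # Every comment ID in this dataset carries the ytc_ prefix.
        if not rec.get("id", "").startswith("ytc_"):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        # Reject any code outside the known value set for its dimension.
        for dim, allowed in DIMENSIONS.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_UgyevwnHYYy6M7jQ4et4AaABAg","responsibility":"developer",'
       '"reasoning":"deontological","policy":"liability","emotion":"fear"}]')
records = parse_coding_response(raw)
print(records[0]["responsibility"])  # developer
```

Validating at parse time catches the common failure mode of LLM-based coding, where the model invents a label outside the schema, before bad codes reach the dashboard.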