Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:
- "AI should never be used for things like this without human oversight. The main r…" (ytc_UgzXGWrzd…)
- "its really just a 50/50 gamble a flip of a coin. it is either going to destroy …" (ytc_UgwdCYG2B…)
- "This matches my thoughts and experiences pretty closely. I regularly use ChatGPT…" (rdc_jiq1q4c)
- "They dont seriously believe in the AI bringing about doom, otherwise they would …" (ytc_UgyoS6Bpt…)
- "How would the philosophy of mind fit in to aid the discussion of ethics of AI?…" (rdc_de2p92s)
- "I feel like there is a market for AI and digital art ... there needs to be set r…" (ytc_UgyAResP6…)
- "Eeeh what ? wasnt halucination basically nearly solved just recetly with that ne…" (ytc_Ugwn3Eb4a…)
- "bro there is some One that winning and ai losing is someone that making bellora …" (ytc_Ugwr7TR4V…)
Comment (youtube · "AI Moral Status" · 2024-12-17T02:3… · ♥ 2)

> Nah, don't worry about this. ChatGPT doesn't even know what your name is in a conversation until you tell it, so even if they use your conversation to train it later it isn't going to make a damn for the thing being able to associate your chats back to you unless ChatGPT manually goes in and applies your name from your profile to the training set and they have basically no reason to do that. The only real worry here is that they'll sell your conversations... and even then they almost certainly won't be selling them in any fashion that bundles your conversations in a way that links them to one another or otherwise makes them personally identifiable.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id": "ytc_UgzE2TCxmQoCUEvKjHx4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyzdZkUhc-4iQP5rO54AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugy4TYq-nNkfwxVuYdF4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxWd9AX6N7KqvYuQ3p4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzAcKJ5bC9jf9B0oNd4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyQ81nhVPw69FsPijx4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugwoe2ZcEXCSas1bf8R4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwyBa736hGSvN21g6F4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxLKTyGB6Li_6KQWbl4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyzrtMc4SWXi5U0AWJ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
```
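Since each raw response is a JSON array with one coding object per comment, looking up a coded comment by its ID is a single parse-and-index step. A minimal sketch in Python (the `index_codings` helper and the two-row `raw` sample are illustrative, not part of the tool; the field names and IDs come from the response shown above):

```python
import json

# A trimmed-down raw LLM response: a JSON array of coding objects,
# with the same fields as the full response above (illustrative sample).
raw = """[
  {"id": "ytc_UgzE2TCxmQoCUEvKjHx4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyzdZkUhc-4iQP5rO54AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]"""

def index_codings(response_text: str) -> dict:
    """Parse a raw response and index the coding rows by comment ID."""
    return {row["id"]: row for row in json.loads(response_text)}

codings = index_codings(raw)
coding = codings["ytc_UgzE2TCxmQoCUEvKjHx4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # ai_itself indifference
```

The same index can then back a "look up by comment ID" box directly, since each dimension in the Coding Result table is just one key of the matching row.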