Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- “Ahh, so China has been duped into thinking artificial intelligence is scientif…” (`ytc_UgxQFzHNr…`)
- “A.I. is just like us, what you idealize and what you experience determines your …” (`ytc_Ugy6HxYvF…`)
- “Tesla has never claimed that autopilot or FSD can prevent all accidents. That's …” (`ytc_Ugy8K9nkA…`)
- “I saw a video talking about how a certain “generative” ai couldn’t create an ima…” (`ytc_UgwX7ifw8…`)
- “AI and humans do not share any resources, they exist on different planes of real…” (`ytc_UgxAbZtc_…`)
- “AI’s the easy scapegoat, but outsourcing and cost-cutting have been happening fo…” (`ytr_UgxN0nNPl…`)
- “Actually I’ve been reading and studying my Bible I’ve read end times prophecy… s…” (`ytc_UgzlTjDqo…`)
- “Not one mention of energy limits. Not one mention of how AI is supposed to start…” (`ytc_UgyHnr7EG…`)
Comment
Here she is again, failing the nuance and going straight for the fearmongering. I'm starting to think she couldn't get hired at OpenAi and has a grudge.
There are 2 ways to prevent your data being shared but before that, your data, if shared for training, is scrubbed first of names, specific private info like names, code, plans, etc, and made anonymous. On top of that, it's shared in aggregate, so the odds of a cohesive story related to something you've shared is low.
Back to how to prevent data being used in training: have a chatgpt plus or better plan and click opt out. Done. Unlike Apple and many other companies, they make it easy and clear what you can do. You can even ask chatgpt these questions, and it will walk you through it.
youtube · AI Moral Status · 2025-06-06T15:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id": "ytc_UgxafMHB0vHMqdetGvF4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytc_Ugz838fY3Qiccjgsi5R4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgztKGBWp-CUE6xcZP54AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxaE2Bjoqr8XpbTRX94AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzQdEZiogfUjPpE2nl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzXRDFWDpTJd8-LgS54AaABAg", "responsibility": "user", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyPxhQvZhO4vjD1VLJ4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugw8o4ey39jVAuZQ3bB4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "mixed"},
  {"id": "ytc_Ugy2BevRLa6TM8cDB694AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgwQWYqQN7kZqvdClvV4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "outrage"}
]
```
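A raw response in this shape can be parsed back into per-comment codes with a short script. This is a minimal sketch: the field names match the JSON shown above, but the allowed category values are inferred from the samples on this page and may not cover the full codebook.

```python
import json

# Allowed values inferred from the sample responses shown above;
# the real codebook may define additional categories.
DIMENSIONS = {
    "responsibility": {"company", "developer", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "mixed", "approval"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes}."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        # Missing dimensions default to "unclear" instead of raising.
        codes = {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
        # Map values outside the known codebook to "unclear" as well.
        for dim, value in codes.items():
            if value not in DIMENSIONS[dim]:
                codes[dim] = "unclear"
        coded[rec["id"]] = codes
    return coded

# Hypothetical comment ID, used only to illustrate the lookup.
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"industry_self",'
       '"emotion":"outrage"}]')
result = parse_coding_response(raw)
print(result["ytc_example"]["emotion"])  # outrage
```

Coercing unknown values to "unclear" keeps a batch run alive when the model drifts off the codebook; a stricter pipeline might log or reject such records instead.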