Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
I always talk to AI very politely like they're real humans for that exact same r…
ytc_UgxvoaGEt…
Something to build off of the joy of the process itself; art is often a challeng…
ytc_UgwQlRaD3…
this scenario is happening in the hi-tech industry, but it will take much more t…
ytc_Ugzt8D5YQ…
"We're going to be stealing AI art and tracing it"
Good. That's what we want. Th…
ytc_UgzR5OHj1…
I respect you for putting your face out there and presenting data publicly. I kn…
ytc_UgwNr8tvk…
I'm fine with humans inspired by other humans because human learning is much muc…
ytc_Ugxmyg2g2…
Well, your question is the same as asking why use an excavator to dig a t…
ytr_UgyeA7OJI…
Agreed this is truly not the way life should be going, a robot shouldn't be givi…
ytr_Ugyxn9rea…
Comment
It's really not a great idea.
I try not to judge too harshly, it's far too common for someone to lack access to actual mental health treatment. If ChatGPT or other LLMs are all they have... they'll do what they have to do. I'd rather they do that than end things.
That does, I think, impose more of an obligation on OpenAI and others to handle customer data and inputs much more carefully than they do. This is an obvious thing people would use it for in this country, "but we don't support this use" might fly legally, but morally they should do more.
youtube
AI Moral Status
2025-06-02T22:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[{"id":"ytc_UgwdACpl18m7KwoK3yV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzVQP_eNPXceliKYB14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw37Rhqs3hdltDmKbF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwUqB1p6yv31bXuJXx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzOw5HtJL8AQDWol9Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugz5WdGnUMVQB251z1B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwdcvI6XCJMltXow8h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx5Bprd_yOtYI5p_qh4AaABAg","responsibility":"government","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzMAYfAzlkc-ShEmZV4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzlQTZFkPTcXyTIHJ14AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"approval"}]