Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect:

- "My artist friends are upset over AI, as well as I. I’m a professional Videograph…" (ytc_UgxAv-aaB…)
- "@tistheartist2985 Haha, thank you for commenting! Yeah, it seems like fighting a…" (ytr_Ugx2HhmWz…)
- "Of course AI will never amount to the perfectness of art humans have made. As a …" (ytc_UgzlSyvWN…)
- "Fuc basic income and fuc bringing ai to work we don't need non of that shi…" (ytc_UgySJlUqa…)
- "I am not afraid of an AI simply because I don´t believe Earth has that many year…" (ytc_UgxjIw0X4…)
- "On your point about cost, in the programming world we have seen the numbers as f…" (ytc_Ugz41xi31…)
- "This people have done AI and now they say that AI it's dangerous? Are they scare…" (ytc_UgwWSbe30…)
- "Ai takes all jobs- people can't survive and the entire economy collapses- people…" (ytc_UgzV-LzeX…)
Comment
No, you shouldn't use AI for therapy, however, your conversations with real therapists, at least in the US, are not as protected as you think. For example, there are things you could talk to an AI therapist about that could require undesired actions from your human therapist as required by law and/or company policy -- like being honest about suicidal thoughts. And if you're going through a child custody fight, you could end up in a situation where everything you've spoken with your therapist about is turned over to opposing counsel.
Source: youtube · AI Moral Status · 2025-06-03T15:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgzjV9CxjLS3r09Yvql4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzSSDeLZaxIGtD0PCN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugzyogm5Wf3r5-whik94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyjkZ6ZOqa-adRC61F4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyBZjgr0KjRvw-vu5x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwJxTqZYP0L8Hiw0Nd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgynOpycslJfOQi6PNB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyRsmzqt_zHIxOQFb94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwfDK15PJDlNODx1CZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyffwSht2sulikfYP94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
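A response in this shape can be parsed and validated before the codings are stored. The sketch below checks each row against the four coding dimensions; the allowed values are inferred from the samples shown above, not a confirmed codebook, and the example ID is hypothetical:

```python
import json

# Allowed values per dimension, inferred from the responses above (assumption,
# not a confirmed coding scheme).
ALLOWED = {
    "responsibility": {"company", "developer", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation", "indifference"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows with an id and known values."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if "id" not in row:
            continue  # a coding is unusable without its comment ID
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

# Hypothetical example: one well-formed row.
raw = '[{"id":"ytc_example","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"}]'
print(parse_codings(raw))  # the single valid row
```

Rows with unknown values are dropped rather than repaired, so a malformed model output surfaces as a missing coding instead of a silently wrong one.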