Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I mean I guess that, also, but mainly there is no reason to believe AI will do more good than harm while cosplaying as your therapist. Sure, your sob stories are out there somewhere, but also your "therapist" can tell you to start stalking your crush, stop taking your meds or turn to alcohol as a vent... the immediate danger is much higher than the possible future privacy concerns.
Source: youtube · AI Moral Status · 2025-07-10T21:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx370zgE092-hFsbCN4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_Ugw1TIGmsEqPLaSWLXZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxG_NKty18fWn_ESv54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxm--PMyzxqJOW1jEt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxrBaXyVZz1mLhvZHh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyhP7AcHn-VMc48Y8t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugw3BPcThEkQzsYZu4d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyKu2snUB7V2jmzIhx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyH9fsERKjJ0fIndTJ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgwNrlDqSV3oiHfbsvp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
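The raw response is a JSON array with one object per coded comment: an `id` plus the four coding dimensions. A minimal sketch of how such a response could be parsed and validated, then looked up by comment ID — note the allowed value sets below are inferred only from the entries shown above, not from an authoritative codebook:

```python
import json

# A trimmed example of the raw LLM response format shown above.
raw_response = """[
  {"id":"ytc_UgyKu2snUB7V2jmzIhx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyH9fsERKjJ0fIndTJ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"}
]"""

# Allowed values per dimension (assumption: inferred from the sample
# responses in this page, not an official schema).
ALLOWED = {
    "responsibility": {"government", "none", "ai_itself", "user"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "liability", "industry_self", "none"},
    "emotion": {"indifference", "approval", "outrage", "resignation", "fear", "mixed"},
}

def parse_codings(text: str) -> dict:
    """Parse a raw response into {comment_id: coding}, rejecting bad values."""
    codings = {}
    for row in json.loads(text):
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: invalid {dim} value {row.get(dim)!r}")
        codings[row["id"]] = {k: v for k, v in row.items() if k != "id"}
    return codings

codings = parse_codings(raw_response)
print(codings["ytc_UgyKu2snUB7V2jmzIhx4AaABAg"]["emotion"])  # fear
```

Indexing by `id` is what makes the "look up by comment ID" view possible: the coding table shown for the comment above is just the entry whose `id` matches it.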