Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- I really hope you are right. I think it's inevitable that we will be working al… (`ytc_UgzF1DkIM…`)
- Stupidity of AI compared to the brain from a neuroscientist I'd love to have a c… (`ytc_Ugxlop4nn…`)
- Might be a unpopular or weird opinion, but I look at AI as a Sword Cane. In one … (`ytc_UgyKYTp10…`)
- I think due to recent development of LLM, it is language model. it will replace … (`rdc_nm8nznh`)
- @kitsunismm @shadowlyy is literature not art in of itself? Is there not art in… (`ytr_UgzFzHhR1…`)
- So this guy knows that ’the algorithm’ is pushing him to be more and more into h… (`ytc_Ugx_Ivqe7…`)
- If AI MIX ALL THAT DATA TI WILL. And I don’t want to but it will. I’m a graphic … (`ytr_Ugy9tjhKf…`)
- No... To be a medical assistant its not easy If AI came into existence we all wi… (`ytr_Ugzu6h1lc…`)
Comment
“But I didn’t use Open AI, so it’s fine”
No, you just used a application which almost certainly just used OpenAI’s API, so two tech companies have your deepest darkest secrets.
Edit: Yes, I am aware of self hosted models and my comment doesn’t relate to self hosting, it relates to third party companies.
Regardless of self hosting, LLM therapy is a terrible idea.
youtube · AI Moral Status · 2024-08-30T13:1… · ♥ 26251
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgyUmVNE1wQCK7-yyPR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxwUbktSCocj45nqR94AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwtd0tbQY-GQFYS61J4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz6pB-M8AeaMMoKWWB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugxqzx74AJdFkIfRmWB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzpUAU1utHPydnd7h94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwKFHJxdSkiD1fO32x4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyppLZJ5sC6xaeC5nd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugxv38mtu4UzJCDkS5l4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyRJnEg8d2mJ2pbiL94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
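The coding result table above corresponds to one row of the raw JSON array: the model returns one object per comment, keyed by comment ID. A minimal sketch of the look-up-by-ID step, assuming the raw response is a JSON array shaped as shown above (the function and variable names here are illustrative, not the app's actual code):

```python
import json

# One entry from a raw LLM coding response, in the same shape as the
# array shown above (id plus four coded dimensions).
raw_response = """[
  {"id": "ytc_UgxwUbktSCocj45nqR94AaABAg",
   "responsibility": "company", "reasoning": "deontological",
   "policy": "regulate", "emotion": "outrage"}
]"""

def lookup(raw: str, comment_id: str):
    """Parse the raw response and return the coded dimensions for one
    comment ID, or None if the model did not code that comment."""
    rows = json.loads(raw)
    return next((row for row in rows if row["id"] == comment_id), None)

coded = lookup(raw_response, "ytc_UgxwUbktSCocj45nqR94AaABAg")
print(coded["policy"])  # → regulate
```

Because the model is not guaranteed to return an entry for every requested ID, the `None` fallback matters: a missing ID should surface as "uncoded" rather than raise a `KeyError`.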