Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_UgxvXT6p2…: "I need these ai bros to actually EXPLAIN their reason The things that bugs me th…"
- ytc_UgzvnSTf1…: "Brothers girlfriend with BS in marketing was fired replaced By AI even when she …"
- ytr_UgzvAE9c8… (@MetsuryuVids): "I think it's shocking that he thinks it possible to model human e…"
- ytc_UgzrjaHQo…: "If generating an AI image makes you an artist, then I’m a carpenter if I get my …"
- ytr_UgyU3G5Ow…: "Eh. .. you are right, it would do the most logical, efficient thing and that is…"
- ytc_UgyvIv6gr…: "10000 years of history and we are keeping and finding more differences and excus…"
- ytc_UgznLfZPz…: "It's gonna go one of two ways; Humans will sabotage AI or AI will sabotage human…"
- ytr_UgzxICmIm…: "They don't even need to distill sea water, which will require enormous amounts o…"
Comment
Moral of the story: you have to run AI therapy machine locally on your own hardware. Unfortunately, all the truly interesting models require quite extensive hardware resources to run with nice-to-use latency. As in tens of thousands of dollars expensive.
You can run something like gpt-oss-20b with consumer grade hardware ("a gaming PC") but that's still pretty dumb therapist.
Source: youtube, "AI Moral Status", 2026-01-08T14:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgyZo3GNUHP9GZeFl-J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwQlQILlg0LwjL7oRN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyQr-Pc2bBYNDRGaYt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxFvvwjQVhchfFmLjJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxVoXxiDVkkuIIvChN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgymRaB3gmo-qgRwE6l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyJqnH981NjWBdHrmJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgzBJg_CmFuomJzmeCd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy036Q9-soasxwv3Kp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwVoC4L_3BoUdiftJV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"mixed"}
]
```
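A raw response like the one above can be checked and tallied before the codes are stored. The sketch below is a minimal, hypothetical validator: the allowed-value sets are assumptions inferred only from the labels visible on this page (the real codebook may define more categories), and the function name `validate_codes` is illustrative, not part of any tool shown here.

```python
import json
from collections import Counter

# Allowed values per coding dimension. ASSUMPTION: inferred from the labels
# seen in the samples above; the actual codebook may list more categories.
ALLOWED = {
    "responsibility": {"none", "user"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"indifference", "outrage", "mixed", "approval", "fear", "resignation"},
}

def validate_codes(raw: str) -> Counter:
    """Parse a raw LLM response (a JSON array of coded comments),
    reject unknown dimension values, and tally emotion labels."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row[dim] not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim}={row[dim]!r}")
    return Counter(row["emotion"] for row in rows)

# Two illustrative rows (shortened IDs) in the same shape as the raw response.
raw = '''[
 {"id":"ytc_a","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_b","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}
]'''
print(validate_codes(raw))  # counts per emotion label
```

Failing fast on an unknown label catches the common failure mode where the model invents a category outside the codebook, which would otherwise silently skew the tallies.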