Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (previews truncated):
- `ytr_Ugxemm2rU…`: That’s hilarious! 😂 The fact that you said you’re ready to “cheat” on ChatGPT cr…
- `ytc_UgzEi9nmC…`: Not sure how this is aging because Claude Code is incredibly versatile at writin…
- `ytr_UgxotdzOa…`: @yourfriend_Lilpe, AI is programmed to always agree with its user and that’s f…
- `ytc_UgyoxBtRa…`: I believe the ones who control AI is too authoritian to be an artist. Unless AI …
- `ytc_UgxgUqPCD…`: Cost Comparison: Human-Made vs. AI-Assisted Productions / Human-Made Productions / …
- `ytr_UgzsgYawO…`: That is a very important and multifaceted question that goes far beyond the tech…
- `rdc_n5iof1a`: And eventually all of its work will be based on its own work. You still need a p…
- `ytc_UgwJy0M-f…`: "The ai is in fact killing this world" / Ai:This image is mona lisa by leonardo da…
Comment
About psychotherapy and some using A.I. for it. Psychiatrists have a lot of nerve saying not to use it for therapy and I'll tell you why. First of all insurance doesn't even cover psychiatrists or psychologists. So when they stop charging 300 dollars a hour MAYBE some won't turn to A.I. and see a Human. With that said, I don't use A.I. or chat bots for that. I use it just like I would use Bing or any search engine.
youtube · AI Harm Incident · 2025-08-10T17:0… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgzPeP8x57P1LmdBTZh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugw0prNnzDCz1ngNPJN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxOwdc45HGgOhPnp794AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgwbQXJ8ZgKVKADlsfJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugy3Tz6jcEL_kBRKxl94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzPPVG_gSHpr6VnDYh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx2Sud6CGH7jG-W4al4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgysqEz1GmbA6wqZvhV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgzqRYft_B5y2g5jNSl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugwn6p022yUxT6HuGDd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"}
]
```
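A minimal sketch of how a raw response like this could be parsed, validated, and indexed for lookup by comment ID. The allowed category sets below are inferred from the sample output on this page; the project's actual codebook may contain additional categories, and `validate_codings` is a hypothetical helper name, not part of the tool shown here.

```python
import json

# Allowed values per coding dimension, inferred from the sample above.
# Assumption: the real codebook may define more categories than these.
CODEBOOK = {
    "responsibility": {"company", "developer", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"liability", "regulate", "ban", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed"},
}

def validate_codings(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments),
    reject any record with an out-of-codebook value, and return a
    dict mapping comment ID -> coding dimensions."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        for dim, allowed in CODEBOOK.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec.get('id')}: unexpected {dim} value {rec.get(dim)!r}"
                )
        coded[rec["id"]] = {k: v for k, v in rec.items() if k != "id"}
    return coded
```

With the response above loaded as `raw`, `validate_codings(raw)["ytc_UgwbQXJ8ZgKVKADlsfJ4AaABAg"]` would return the coding shown in the table (responsibility `company`, policy `liability`, and so on), and any hallucinated category in the LLM output fails fast instead of silently entering the dataset.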