Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- `rdc_d0fr7pc`: Somehow the market didn't eliminate companies like Nestle despite popular boycot…
- `ytc_UgwuqmCoN…`: I am a fine arts bachelor too, I was laid off in the gaming crash too, I made AI…
- `ytr_Ugy3NLU59…`: AI is ultra filtered. They will let u see only what keeps u manipulated AI is a…
- `ytc_UgygYRXvM…`: Alternate title: LavenderTowne strikes back Great video. The Hayao Miyazaki sec…
- `ytr_Ugx1j8jj8…`: “why learn if it’s so easy to make”? I’ll tell you! 1: art has SIGNIFICANT his…
- `ytr_UgzMZscDf…`: When it comes to Angel Engine (Or any AI art others find value in) @theunearthly…
- `ytc_UgzJURgK3…`: There's a video from someone very knowledgeable of AI (Channel name Internet of …
- `rdc_ohyygt4`: Exactly and since the only time you really were them is when you are do stuff th…
Comment
The real question to be asked is *why does a human have to depend on AI over real human connections.*
Not
How to reduce the ways humans use AI
AI is just in its beginning stages. To be fair, only when the society changes, the people in power are held accountable only than will human life be worth living--as of now it ain't
To be honest if the companies are held accountable than that means more safety for us. That means AI will assist us better and thereby reduce the chances of AI induced suicides.
That will be better for me--that means i dont have to experience ai induced suicide.
Platform: youtube
Topic: AI Harm Incident
Posted: 2025-11-08T12:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | mixed |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgyrLdkwr9erc1MX8Kp4AaABAg","responsibility":"company","reasoning":"mixed","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxHxA6F3ggeIU2rx5h4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgwcSVgOjuDO2FGU5Hx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugw7p6MxUAQUuLcpshZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgylAYoqDcHCxfD2GLh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxTjQJejR1II_BMuGl4AaABAg","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzrAzIDbxYTdX_M5cZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgybG9sSBJLInxl4kdJ4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwE0QG8mG2C5QJS9094AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxF0qXLZSqsfFrcmTR4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"approval"}
]
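The raw response above is a JSON array in which each object carries a comment `id` plus the four coded dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a batch can be parsed and then looked up by comment ID follows; the field names come from the response shown, while the variable names and the `lookup` helper are illustrative, not part of the actual tool.

```python
import json

# Two rows excerpted from the raw LLM response above; the field names
# (id, responsibility, reasoning, policy, emotion) match that response.
raw_response = """
[
  {"id": "ytc_UgyrLdkwr9erc1MX8Kp4AaABAg", "responsibility": "company",
   "reasoning": "mixed", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxHxA6F3ggeIU2rx5h4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "industry_self", "emotion": "mixed"}
]
"""

codes = json.loads(raw_response)

# Index every coded comment by its ID so a single comment's coding
# can be retrieved in O(1), as the "Look up by comment ID" view does.
by_id = {row["id"]: row for row in codes}

def lookup(comment_id):
    """Return the coded dimensions for a comment ID, or None if uncoded."""
    return by_id.get(comment_id)

print(lookup("ytc_UgyrLdkwr9erc1MX8Kp4AaABAg")["policy"])  # liability
```

In practice the model may also return malformed JSON, so a production version would wrap `json.loads` in error handling before indexing.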