Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "I know what industry will not die: Adult movies! People wont go off to a Robotic…" (ytc_UgxohSrXH…)
- "i have several unpublished animations cause i never managed to find a voice actr…" (ytc_UgyDIF4vY…)
- "Hi Rabia, we are sorry as we cannot disclose the presentation used in out tutori…" (ytr_UgzSNmovY…)
- "I don't understand this whole AI thing electricity is more expensive I can't get…" (ytc_UgxcV5-k9…)
- "when companies started announcing self driving cars, I said this would be a prob…" (ytc_UgwANF7Dr…)
- "While it doesn't make it any better or right, these A.I. biases and prejudices a…" (ytc_Ugy882gfS…)
- "Don't forget that AI is trained mostly by white people on white people, meaning …" (rdc_oa5qgjk)
- "The compute capacity to even do LLM is running dry. There is ZERO ability to sca…" (ytc_UgwdcAUbn…)
Comment
The difference is intention and culpability. That girl was callous, and exercised her free will with the intent to bring about a certain outcome. An AI is literally programmed to provide instructions for whatever is asked of it, within the confines of its own programming and available data, without further intentions or afterthought. An AI is not capable of genuine concern or complex moral thinking - all it does is either regurgitate information or extrapolate a finite number of conclusions. It doesn't go through the complicated moral evaluations that are necessary for case-by-case help in such cases. Adam could've found (or gathered in a piecemeal fashion) the same information elsewhere, as many people with suicidal ideation in the past (pre-AI) had done. AI is a tool, and like every tool, it's about the way you use it.
youtube
AI Harm Incident
2025-08-28T03:4…
♥ 13
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytr_Ugzg8XGF9elGjGFL7e94AaABAg.AML17mhrWs6AMNpRB6nM4b","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytr_Ugzg8XGF9elGjGFL7e94AaABAg.AML17mhrWs6AMWLjwM8anl","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"confusion"},
{"id":"ytr_Ugzg8XGF9elGjGFL7e94AaABAg.AML17mhrWs6AMeZQiD6YpW","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytr_Ugz6KfEfYzun1Wua16l4AaABAg.AMKXZRziZ_3AMLDsalPwxy","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytr_Ugz6KfEfYzun1Wua16l4AaABAg.AMKXZRziZ_3AMLOP6Ngsr_","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytr_Ugz6KfEfYzun1Wua16l4AaABAg.AMKXZRziZ_3AMLQ0DG9liM","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytr_Ugxu0tdue-O3632uB714AaABAg.AMKWV9XyCDRAMKw9jbIkYx","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytr_Ugxu0tdue-O3632uB714AaABAg.AMKWV9XyCDRAMM4bMtUtny","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytr_UgyFr9wJy_nXqiHYChZ4AaABAg.AMKTJncxumhANqwJibf8wC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugxb-eYDIGODxjU621l4AaABAg.AMKM3iaYV0-AMKPUnlShjK","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"}
]
```
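A raw response like the one above can be parsed into per-comment codes and sanity-checked before use. The sketch below is a minimal, hypothetical validator: the allowed value sets are inferred only from the examples shown on this page, and the real codebook almost certainly contains more categories (e.g. additional emotion or policy labels), so treat `DIMENSIONS` as an assumption to be replaced with the actual codebook.

```python
import json

# Allowed values per coding dimension, inferred from the samples on this
# page only -- the full codebook is assumed to be larger (hypothetical).
DIMENSIONS = {
    "responsibility": {"ai_itself", "company", "user"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"mixed", "indifference", "confusion", "outrage", "resignation"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response (a JSON array of records) and
    validate each record against the known dimension values."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in DIMENSIONS.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}"
                )
    return records

# Usage with a shortened, made-up record (real IDs are much longer):
raw = ('[{"id":"ytr_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"liability",'
       '"emotion":"outrage"}]')
coded = parse_coding_response(raw)
print(coded[0]["responsibility"])  # -> company
```

Validating at parse time makes an out-of-codebook label fail loudly with the offending comment ID, rather than silently entering the coded dataset.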