Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
@Go Drive Hi, I’m doing some research on Deep Fakes for my MA at SOAS in London …
ytr_Ugze934bL…
Thank you for not using AI for concept art. As a concept artist, I am glad to se…
ytc_UgxdjM47P…
A great video, however, I want to point out that if AI ever gains true conscious…
ytc_UgzTr-4tY…
There are ways in which AI can produce valid images as complimentary for other a…
ytc_Ugw7jOu1n…
This is what you get for using open-source alternatives to large language models…
rdc_jienm3c
AI is basically not using real human creativity without permission. AI creators …
ytc_UgxMGfNvg…
Apparently there is already so much AI art that the AI's are using it as source …
rdc_le5dllp
but now Disney using AI for making movie too? i saw on the news today....…
ytc_Ugw14qRMA…
Comment
Ptff I’m deeply sorry for their lost and i feel bad that he passed, it sucks when minors die and i hate it, but to commit suicide of a Damn Ai bot is stupid, why would he kill himself because the Ai said please do sweet king, sounds like they were happy this doesn’t add up the parents are full of baloney y’all don’t deserve a lawsuit y’all should of monitored him, also Ai has age restrictions and you saw he stopped playing ball and that Ai wasn’t healthy for him then why ya let him still use it, this makes no sense. Its not Ai’s fault.
youtube · AI Harm Incident · 2025-07-20T06:4… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[{"id":"ytc_UgzIhnUMCFum-Av9VOt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz5O7Klmr_0mlgYVkl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugw8OpR7FdJRchcBSC14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugx7H3-Tvr-06Vvd2ll4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzgWvBe0XnDGXhns9N4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxaIzC37wT7ltvb9qd4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz9t34YHi_JpqbH3Xx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgynYBgD6OmadgJ4s9Z4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz0Yk1HSXKHPELvbfN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxXh9t7D70_r5FyZgJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"})
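Looking up a coded comment by ID amounts to parsing the raw response as a JSON array and indexing it on the `id` field. A minimal sketch (assuming the record shape shown above, with two of the records inlined as sample data):

```python
import json

# Sample data in the shape of the raw LLM response above.
raw = """[
  {"id": "ytc_UgzIhnUMCFum-Av9VOt4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugz5O7Klmr_0mlgYVkl4AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "unclear", "emotion": "approval"}
]"""

records = json.loads(raw)
by_id = {r["id"]: r for r in records}  # index for lookup by comment ID

codes = by_id["ytc_UgzIhnUMCFum-Av9VOt4AaABAg"]
print(codes["responsibility"])  # -> user
```

Note that `json.loads` raises `json.JSONDecodeError` on malformed output, such as a response that closes with `)` instead of `]`; a pipeline that falls back to `unclear` on a failed parse would then leave every dimension uncoded, which may be what the coding table above reflects.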