Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> Why we expect AI to behave better than humans? At least quarter of humanity had engaged in some kind of unethical actions. Some humans engaged in really bad actions and it can't be avoided. AI ib this sense is as humane as we are

youtube · AI Harm Incident · 2025-08-30T20:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | virtue |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[{"id":"ytc_Ugxd63-vWhhLxQ3R5gR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyapAp6v3cSJb3lWxl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugxg8YCpYbS189NpfoJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyiaFx-r0pBS8dnvj14AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyrsrOfRbxEz2CMcQJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzCRVdqG6o_WsQYZtR4AaABAg","responsibility":"researcher","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzcNlHs20UsJ06MpXd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzIhW_apoYMF8NANX54AaABAg","responsibility":"user","reasoning":"virtue","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwEhPgmoMp91RlIIo14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyooTF2KL1o0zfESb94AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}]
```
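The raw model output is a JSON array with one object per comment, keyed by comment ID and carrying the four coding dimensions shown in the table above. A minimal sketch of how such a response could be parsed and indexed by comment ID (the field names follow the JSON above; the two sample entries and the lookup itself are illustrative, not part of this tool's actual code):

```python
import json

# Hypothetical sample in the same shape as the raw LLM response above.
RAW_RESPONSE = """[
  {"id": "ytc_Ugxd63-vWhhLxQ3R5gR4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyiaFx-r0pBS8dnvj14AaABAg", "responsibility": "distributed",
   "reasoning": "virtue", "policy": "none", "emotion": "resignation"}
]"""

# The four coding dimensions reported for each comment.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse a raw coding response and map comment ID -> coded dimensions."""
    codings = {}
    for entry in json.loads(raw):
        # Keep only the expected dimensions, dropping anything extra.
        codings[entry["id"]] = {dim: entry[dim] for dim in DIMENSIONS}
    return codings

codings = index_codings(RAW_RESPONSE)
print(codings["ytc_UgyiaFx-r0pBS8dnvj14AaABAg"]["reasoning"])  # virtue
```

Indexing by ID this way mirrors the "look up by comment ID" workflow: one parse of the model output, then constant-time retrieval of any comment's coding.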