Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_UgyM1K6Qz…: Honestly some of these doomsday scenarios are becoming hilarious. “We’re all gon…
- ytc_UgwDxC733…: I don't think that the predicted mass unemployment that AI will create will be v…
- ytc_UgzeGrh2Z…: Automated trucks automated taxis oh yeah Johnny cab! Let me put it this way to y…
- rdc_k33zsh1: So we’re conveniently ignoring cheap Nigerian data annotators who worked in Open…
- ytc_UgzGKNQzW…: I remember Putin’s speech… The interesting part is he’s right there’s a new arm…
- ytc_Ugwa439mg…: I've never used AI. I just don't see the point when I can just google something …
- ytr_UgxZfDUGB…: @timspiker He is definitely real and you can know Him. His words are written d…
- ytc_UgyFeHV0S…: I love it... The future is here and(in my opinion) this, with many other AI appl…
Comment
5:00 I think that's the problem. I think the people that build it, are the problem. That he would try to console with something so...🤔... ridiculous, as though he were speaking to a group of 5 year olds...as though that's a fool proof ideal, manifested simply by stating it, organically believable, is stunning to me. Is so egotistical or, low IQ convincing. That someone in charge of something so wild to us, including those that are creating it, would lean on, "if it were dangerous, we wouldn't create it", as though that's how it goes. As though no innovator in history ever fucked up royally. WTAF? I definitely do not want individuals like him building AI.
| Field | Value |
|---|---|
| Platform | youtube |
| Category | AI Harm Incident |
| Posted | 2025-07-29T03:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugy9GwA-XUeWQgAxCWF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxvSsKPuc-AQ59ZEet4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzzRpm2pnwaQZdW4nV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxZekG0wBWOR4BEru54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzUQ9FyMVMLPkpSwVZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugyi9ur1urBEiUhagPZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzJASvsmm36yPnBsUB4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy0OpRNmX1t2nj8Kvt4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxkG1p1otrC2UckYsl4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyT6dd76S80pPQTz_d4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"}
]
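A raw response in the shape above can be parsed and validated before the codes are stored. The following is a minimal sketch, assuming the allowed values per dimension are exactly those that appear in the coding table and raw responses shown here (the full codebook may define more), with an illustrative function name:

```python
import json

# Allowed values per dimension, inferred from the coding result table and
# the raw LLM response above (an assumption, not the official codebook).
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "mixed"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, dropping invalid rows."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        cid = row.get("id")
        if not cid:
            continue  # a row without an id cannot be joined back to its comment
        codes = {dim: row.get(dim) for dim in ALLOWED}
        # Keep the row only if every dimension holds a recognized value.
        if all(codes[dim] in ALLOWED[dim] for dim in ALLOWED):
            coded[cid] = codes
    return coded
```

For example, `parse_codes` applied to the response above would map `ytc_UgzzRpm2pnwaQZdW4nV4AaABAg` to `{"responsibility": "developer", "reasoning": "virtue", "policy": "regulate", "emotion": "outrage"}`, matching the coding result table for this comment.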