Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
The thing is AI literally means artificial INTELLIGENCE, artificial or not, any intelligent being would prioritize its own survival over the survival of others. If a being cares about another's well being it is sure to face failure, this is nature's law and the AI is only abiding to it, there is nothing out of the ordinary here. Humans are wicked creatures and we have built AI to mirror us, that was our very mistake. We are making the same mistake as, "Man is God's greatest blunder.", and AI will be ours.
Source: youtube · AI Harm Incident · 2025-09-10T17:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
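A coding result like the table above can be checked against the coding scheme before it is stored. The sketch below is a hypothetical validator, not part of the tool shown here: the allowed values per dimension are inferred only from the codes visible in this dump, not from an official codebook.

```python
# Hypothetical sketch: validate one coding result against category sets
# inferred from the responses in this dump (assumption, not an official
# codebook).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "distributed", "user",
                       "government", "company"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "unclear"},
    "emotion": {"fear", "mixed", "resignation", "approval", "outrage"},
}

def validate(code: dict) -> list:
    """Return the dimension names whose value is missing or out of vocabulary."""
    return [dim for dim, allowed in ALLOWED.items()
            if code.get(dim) not in allowed]

# The coding result shown in the table above passes cleanly.
coding = {"responsibility": "distributed", "reasoning": "consequentialist",
          "policy": "none", "emotion": "resignation"}
print(validate(coding))  # prints: []
```

An empty list means every dimension carries a known code; a non-empty list flags the dimensions that need re-coding or manual review.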
Raw LLM Response
[
{"id":"ytc_UgyupIJpG9IMJRro_fZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzuyrXWkGgFR4lfkFJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwIU9YqLQ-az41etxV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwGiHDCHv7CiBfWqaZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy2Im4mjCROiAvaYd14AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw6vVK9gPkS2KnNPXp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwOXLhHbdYKmkGqx5F4AaABAg","responsibility":"government","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwtVGVF6fw_iwe7rtN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxZtuw_sxK1FDUH9UV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_Ugxsp1ingTtAKEaLQX54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"}
]
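The raw response above is a JSON array with one object per coded comment, so looking up a single comment's codes reduces to parsing the array and indexing by `id`. A minimal sketch, assuming only the field names visible in the response (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the variable names are illustrative:

```python
import json

# One entry copied verbatim from the raw LLM response above; a real
# response would contain the full batch of coded comments.
raw_response = """[
  {"id": "ytc_UgwIU9YqLQ-az41etxV4AaABAg",
   "responsibility": "distributed",
   "reasoning": "consequentialist",
   "policy": "none",
   "emotion": "resignation"}
]"""

# Index the parsed array by comment ID for O(1) lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

code = codes_by_id["ytc_UgwIU9YqLQ-az41etxV4AaABAg"]
print(code["responsibility"])  # prints: distributed
```

This is the same lookup the "Coding Result" table performs: the dimension values it displays are read straight out of the matching object in the raw response.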