Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "The last thing I saw was artificial intelligence in an automated cat litter box …" (`ytc_Ugwui1ekJ…`)
- "To start the year with the most articulate anti ai utterance is well heart warmi…" (`ytc_UgwdeocDy…`)
- "if AI's are so smart, they would already have told us that Mars can never be ter…" (`ytc_Ugyn0v0w6…`)
- "The important point here is: don’t trust code that has not been properly vetted:…" (`ytc_Ugxa4bp4J…`)
- "Tlaib is one crazy woman.She and Rep.Jayapal look the same who does the facial …" (`ytc_Ugwd79DfR…`)
- "NaNoWriMo authors have the opportunity to do the funniest thing ever right now. …" (`ytc_UgxFpqnyt…`)
- "This is so pleasing to watch and listen to although the content is slighly distu…" (`ytc_Ugzv51RZD…`)
- "This reminds me of the old 'we should teach math basics' versus 'let the kids us…" (`ytc_Ugxh2EVQH…`)
Comment
The people who made nukes thought it was going to be the end of the world when people started using them, it's been over 60 years since then and the world still hasn't ended, AI isn't going to end the world, not because it won't be smart enough but because it doesn't need to, it's kind of like the show Pantheon on Netflix, or Detroit become human, people are going to be soo against AI that it goes against them, when that's not needed at all
youtube
AI Harm Incident
2025-08-30T20:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[{"id":"ytc_Ugxd63-vWhhLxQ3R5gR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyapAp6v3cSJb3lWxl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugxg8YCpYbS189NpfoJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyiaFx-r0pBS8dnvj14AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyrsrOfRbxEz2CMcQJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzCRVdqG6o_WsQYZtR4AaABAg","responsibility":"researcher","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzcNlHs20UsJ06MpXd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzIhW_apoYMF8NANX54AaABAg","responsibility":"user","reasoning":"virtue","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwEhPgmoMp91RlIIo14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyooTF2KL1o0zfESb94AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}]
```
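A response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal example: the allowed values per dimension are inferred only from the codes visible on this page, so the real codebook may permit more categories, and the `validate_codes` helper name is illustrative rather than part of the tool.

```python
import json

# Allowed values per dimension, inferred from the coded output shown
# above; the full codebook may contain more categories (assumption).
SCHEMA = {
    "responsibility": {"none", "ai_itself", "distributed", "company", "researcher", "user"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "ban", "liability", "regulate"},
    "emotion": {"indifference", "outrage", "resignation", "approval", "fear"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Drop records without a plausible YouTube-comment id.
        if not isinstance(rec, dict) or not str(rec.get("id", "")).startswith("ytc_"):
            continue
        # Keep only records whose every dimension uses an allowed value.
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"deontological","policy":"ban","emotion":"outrage"}]')
print(validate_codes(raw))
```

Filtering rather than raising keeps a single malformed record from discarding an otherwise usable batch; rejected ids could instead be logged for re-coding.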