Raw LLM Responses
Inspect the exact model output for any coded comment by looking it up by comment ID.
Random samples
- It's like some groups of elite got bored and let AI move haywire in the world an… (ytc_UgwEOGdX0…)
- And the funny thing is all these scientists and engineers are racing to advance … (ytc_UgyKiepod…)
- Big no on Yang. Some of the most terrifying words in English are, “I am from the… (ytc_UgwFwWinY…)
- If yiu use Ai to make character sure, that is for you, but thrn you try and sell… (ytc_UgwygLnHo…)
- I hope that no Karen robot comes to existence, if that happens its basically imp… (ytc_Ugys1h8J4…)
- "The article on Tesla's Autopilot seems to cherry-pick negative incidents withou… (ytc_Ugz3of2MQ…)
- I think the connection between AI bros and slugs is rather simple, actually. Slu… (ytc_UgzbDAVVp…)
- Yes we need to fix ourselves first. Although clearly well planned uses of AI can… (ytc_UgyQZm3rL…)
Comment
Interesting video, but please stop talking about AIs as if they have intent and motivation, they. Do. Not. They are not thinking, they don't panic, they don't try to come up with excuses, they only appear to do so because that's how many humans in their dataset have so this is probably. None of this is thinking, and developers embracing that parlance are just perpetuating this myth that Sam Altman and his pals have made up to make their product sound incredibly smart. So tired of people describing AI's as "thinking", "feeling" or even just "responding" as if it's a real human.
Source: youtube · AI Jobs · 2026-03-17T17:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
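Each coded comment gets exactly one value per dimension. A minimal sketch of validating such a record, assuming the value sets visible on this page (`ALLOWED` below is inferred from the visible samples and may well be incomplete):

```python
# Validate a coded record against the dimension values seen on this page.
# NOTE: these value sets are inferred from the visible data, not a full codebook.
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "media", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"none"},
    "emotion": {"outrage", "approval", "indifference", "mixed", "resignation"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record looks valid."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"unexpected {dim} value: {value!r}")
    return problems

record = {
    "responsibility": "developer",
    "reasoning": "deontological",
    "policy": "none",
    "emotion": "outrage",
}
print(validate(record))  # → []
```

A check like this catches the common failure mode of LLM coders drifting outside the label set (e.g. emitting "anger" instead of "outrage"), so malformed codings can be flagged before they enter the dataset.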
Raw LLM Response
[
{"id":"ytc_UgxACo_WD9ZGX1VTsWZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyQ6mWi2J08MWUcOgR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwV5W29dp8fu6DIFDt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwRe1x94dZ__cXTdp54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwquvARtQAWgvvMXft4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzXenYsS71UMSK2ajB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw4DfnvDNSO0rbSDxJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyp5kZY-ycJ1fxh95x4AaABAg","responsibility":"media","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwZ6uZOPfOGIk56IK94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzkIL48ZbKzfsBoNed4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
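The raw response is a JSON array of per-comment codings, so the "look up by comment ID" view above amounts to parsing the array and indexing it by `id`. A minimal sketch, assuming the field names shown in the response (the two-record `raw` string here is an abbreviated stand-in for the full array):

```python
import json

# Raw LLM response: a JSON array of coded records, as shown above
# (abbreviated to two records for illustration).
raw = """
[
 {"id":"ytc_UgwV5W29dp8fu6DIFDt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
 {"id":"ytc_Ugyp5kZY-ycJ1fxh95x4AaABAg","responsibility":"media","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
"""

records = json.loads(raw)
by_id = {r["id"]: r for r in records}  # index codings by comment ID

# Fetch the coding for one comment.
coding = by_id["ytc_UgwV5W29dp8fu6DIFDt4AaABAg"]
print(coding["emotion"])  # → outrage
```

In a real pipeline `json.loads` would be wrapped in error handling, since a model response is not guaranteed to be well-formed JSON; a parse failure is one reason to surface the raw text verbatim, as this page does.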