Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Fellow autistic artist here! And, I have several friends who are also autistic a…" — `ytr_Ugz_i0eRR…`
- "To those interested in knowing both sides of the general AI argument(not art spe…" — `ytc_Ugx99v8T6…`
- "4:25 this is where he stops making sense. \"There's almost no limit to how much …" — `ytc_UgyK7N25Q…`
- "But to be honset, for that you need actual geniusses in what they do. Nobody rem…" — `ytr_UgzpFFXRa…`
- "So true, it just that males have a few more ribs he can have to... give us more …" — `ytr_UgwMIaLb5…`
- "The Truth is to have a Fully Autonomous Vehicle that can handle almost any Weath…" — `ytc_Ugxuf29DM…`
- "Next time you’ll a AI round table discussion with the AI optimists, invite this …" — `ytc_Ugw4rfxwc…`
- "I like how these guys are pretending to be worried about AI scheming as though i…" — `ytc_UgzV2n-0r…`
Comment
AI is the ultimate sunk cost fallacy. They've invested just enough to make it appeal to corporations, and now the corporations are realizing that it doesnt measure up in the slightest to what they were told.
Their response isn't to cut their losses, to say "yeah, this AI thing isn't working how we wanted it to", but instead to double down on the problem, and throw billions into an industry that by it's nature cannot make a profit. They *CANT* give up now, because they've invested billions, and *CANT* not invest more, because if they did it's the same as if they gave up on the whole thing anyway. They will keep throwing money until either AI succeeds in doing basic human tasks, or poisoned data kills it.
youtube · 2026-03-03T20:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[{"id":"ytc_UgwkQQlvc8QTpmp5F_94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgySWL-KNi_0Ny376rt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy3NdkpAoaLPgwu-FZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugybw41gSnTeI5h2eo54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx00BNZo1Weat9dBIh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyBpenOT8VBNSlj3KF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxQASaMT-HzEoKGVjJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxOOtMBfPm6iMfGoEJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyoA_nEq7MJXQpNOPp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugxvj-sw4NVXI7gBJiN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"}]
```
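The "look up by comment ID" view amounts to parsing this raw batch response and keying each record by its `id` field. A minimal Python sketch of that step, using two records from the response above (the `index_by_comment_id` helper is an illustration, not the dashboard's actual code):

```python
import json

# Two records copied verbatim from the raw LLM response above. The field
# names (responsibility, reasoning, policy, emotion) match the dimensions
# shown in the Coding Result table.
raw_response = '''[
{"id":"ytc_UgwkQQlvc8QTpmp5F_94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugy3NdkpAoaLPgwu-FZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]'''

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw batch response and key each coding record by comment ID.

    Hypothetical helper: real parsing would also validate the JSON and
    check that each value falls within the coding scheme's allowed labels.
    """
    records = json.loads(raw)
    return {
        rec["id"]: {k: v for k, v in rec.items() if k != "id"}
        for rec in records
    }

codes = index_by_comment_id(raw_response)
print(codes["ytc_Ugy3NdkpAoaLPgwu-FZ4AaABAg"]["emotion"])  # indifference
```

Because the model returns one JSON object per comment, a failed lookup (a comment ID missing from the response) is the natural signal that the model dropped an item from the batch.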