Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "There is an age of regulation coming that will eclipse all other regulatory init…" (ytc_Ugz3lrupt…)
- "Hi, I am here from 8 years into the future - the first to beta test Google's new…" (ytc_UgwjQ91C6…)
- ""ai art is boring" how about AI ART AND GENERATIVE AI AS A WHOLE ACTUALLY FUCKIN…" (ytc_UgzRMJIVy…)
- "That last statement weirded me out. "asking for consent is a good policy for ev…" (ytc_UgwhB6HTh…)
- "Good lord those people who were specifically targeting your art and then respond…" (ytc_UgwEnvN0_…)
- "Hello there, I see your point and it is correct but you're making it to the extr…" (ytc_UgwpRprUb…)
- "They never got involved past Face Book support video Google plus remember, they …" (ytc_Ugwfgrb4n…)
- "to be fair, im fine with AI art but selling it is CRAZY. i understand paying mon…" (ytc_UgyHPTvoX…)
Comment
Quinton seems to think he has any damn clue how the human brain or LLMs work. He doesn't have an inside view on those things, because literally no one does.
Just a race of digital humans would already be doom. I never heard him explain why it is impossible for AI to be structured the same way as the human brain, with the same capabilities. If he believes that is possible, what he should be saying is: "Thank god we went with LLMs! We have to immediately shut down all other AI research! And we have to make sure that LLMs are never capable of contributing to AI research, or at some point they will discover a more powerful configuration which will violate all of my nice assumptions!"
Source: youtube · 2026-03-31T02:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugy1EdtJyWvtvqqcGvd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy2jXt3QGvT9AoDaKl4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwaPUTlcoFwUyX5Q5t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwUoM1-X-1LQAuSnTN4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz5YNW1A-YgjUZbzLR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyU4DW73JbINRI0qW14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwS8qojC6vMABuDxk54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwQpTn1IqFlkpYN4RJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxuiZUl6eTwDiphuwJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzFaZVfFaQUyK61Qhp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
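The raw response above is a JSON array of per-comment codes, one object per comment, each carrying the five dimensions shown in the table. A minimal sketch of how such a response could be parsed and indexed by comment ID to support the look-up at the top of the page — the helper name and the inlined sample data are illustrative, not part of any real tool:

```python
import json

# Two records copied from the raw response above, inlined for the sketch.
RAW = '''
[
 {"id":"ytc_UgwUoM1-X-1LQAuSnTN4AaABAg","responsibility":"government",
  "reasoning":"deontological","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_Ugz5YNW1A-YgjUZbzLR4AaABAg","responsibility":"none",
  "reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
'''

def index_by_id(raw: str) -> dict:
    """Parse the model output and map comment ID -> coded dimensions."""
    records = json.loads(raw)
    # Drop the "id" key from each record; it becomes the dictionary key.
    return {r["id"]: {k: v for k, v in r.items() if k != "id"} for r in records}

coded = index_by_id(RAW)
print(coded["ytc_UgwUoM1-X-1LQAuSnTN4AaABAg"]["emotion"])  # -> outrage
```

With the codes keyed by ID, any coded comment's dimensions can be retrieved directly, which is what the "Look up by comment ID" view does.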