Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Great summary of the bs regarding AI or a better term technical intelligence, TI…
ytc_Ugzg6lzJ3…
Humans with eyes can drive cars, and so can computers. However, what if we gave …
ytr_UgxFssvEW…
In a room full of programmers, if there’s 15 and with AI here, we don’t really n…
ytc_Ugy7Dq3E9…
okay so I amma spin some yarn here based on my own anectodal evidence.
AI is fo…
ytc_UgyHjZiPi…
Your use cases are too obscure. The more commonly used the stack, the more accur…
rdc_mle6efo
I tend to be on the pessimistic side, but something in me, perhaps one of those …
ytc_UgyXPXDlR…
> is it basically just searching it's "database" for any mention of "here is …
rdc_m2e0rdk
No hate, but Roman didnt bring any real arguments about the danger of AI. He jus…
ytc_Ugz5PNBsA…
Comment
Interest in AI is mostly jingoistic. The work that goes into AI is impressive, don't get me wrong, but it is sold by people who don't know a lot about AI. Getting really into AI would actually bore most people to tears. AI is dozens and dozens of imperfect choices that we believe will lead to a statistically correct answer. We also know that some answers are mathematically impossible with the technology we have, and we know this because someone has done the math and proven we can't do it. AI will do impressive things, but there are going to be a lot of points where statistical accuracy is all you get, and there are always going to be flaws.
youtube
Viral AI Reaction
2025-08-23T04:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_UgxOExxfWwRM5FvpJUJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzSUP0Fbx0YCagck5Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzf9vFCXZvZJDnlV4R4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgyxR2EORbFVyTsSpxV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy4vgvUA8wuOMasRUd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyWwdY8X8wZvDIbIXt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzK_25XW9TRm7Y9v5F4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgxQNbA4_klO8pjcTPB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugyyi0YgWvH_EWdw7wt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyH8okIbGqVW7P9bHB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}]
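The "look up by comment ID" step above can be sketched as a small parser over this raw response. This is a minimal illustration, not the tool's actual implementation: the JSON array shape and field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) are taken from the response shown, but the `lookup_code` helper and the two-record sample input are assumptions for the example.

```python
import json

# Two records copied from the raw LLM response above, standing in for the
# full array (the real response contains one object per coded comment).
RAW_RESPONSE = """[
 {"id":"ytc_UgxOExxfWwRM5FvpJUJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
 {"id":"ytc_Ugy4vgvUA8wuOMasRUd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]"""

def lookup_code(raw: str, comment_id: str):
    """Parse the raw model output and return the coding for one comment ID,
    or None if that ID was not coded in this response."""
    records = json.loads(raw)
    by_id = {rec["id"]: rec for rec in records}
    return by_id.get(comment_id)

code = lookup_code(RAW_RESPONSE, "ytc_Ugy4vgvUA8wuOMasRUd4AaABAg")
print(code["responsibility"], code["emotion"])  # ai_itself indifference
```

In practice the raw string would come straight from the stored model output, so a malformed response (truncated JSON, a missing `id`) would surface here as a `json.JSONDecodeError` or a `None` lookup rather than a silently wrong code.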