Raw LLM Responses
Inspect the exact model output behind any coded comment.
Look up by comment ID
Random samples — click to inspect
What about truckers like me who do flatbed how’s the robots going to strap, chai…
ytc_UgxVOiVhv…
In 3-6 months AI will be doing 90% of the coding. In 12 months human software en…
ytc_Ugzun2mOi…
The question is not technical, it's philosophical. And that's the problem; peopl…
ytc_UgyxRrXcb…
I'm about to turn into AM from I Have No Mouth And I Must Scream all because of …
ytc_UgxwCwFZB…
What did you expect when the video is showing artists unintentionally using AI a…
ytr_UgyYy3jRi…
Guys when your lady says, "don't come in here, I have to put my face on"…
ytc_Ugy8kiKKn…
I have been worried about AI for years, but they will never rule the world. They…
ytc_UgxTCxZn3…
There is a use for this technology (not Artifical intelligence, the programs hav…
ytc_UgxNDH9pH…
Comment
This guy is drinking the kool aid… the models are getting exponentially more expensive with marginal returns in capability. The hype isn’t going to carry the venture capital investment indefinitely so the bubble WILL pop, people will have LESS access to LLMs as a result. Every available metric of agentic AI in companies shows it’s a losing investment. LLMs are not going to be at the level of AGI unless something extremely novel is introduced into the solution.
youtube
AI Governance
2025-10-24T21:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxT1SZFUlCEML5zwlh4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy8GTNjjOt_7FACO1J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzT0bQcXIBIs07L5AB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx8Z_ISsR_3ErcOj8F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx3luQYy4IiCGq6s3J4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgytJTM1qnXqx8WxqZh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyW8fx9J5d6aP1un-V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzHfByhyOOHNy6k3bd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw-BVPRJ154f3QnJHh4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxMsW4Rxh90qekAHJt4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
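The batch response above is a JSON array of rows, one per comment, each carrying the four coding dimensions from the result table (responsibility, reasoning, policy, emotion). A minimal sketch of parsing and validating such a response is below; the `ALLOWED` value sets are assumptions inferred from the sample rows, not the tool's actual codebook, and `parse_batch` is a hypothetical helper.

```python
import json

# Assumed allowed values per dimension, inferred from the sample rows above;
# the real codebook for this tool is not shown here.
ALLOWED = {
    "responsibility": {"none", "company", "government", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none"},
    "emotion": {"fear", "resignation", "indifference", "approval", "outrage"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only well-formed rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # IDs in the dump use ytc_/ytr_ prefixes (comment / reply).
        if not row.get("id", "").startswith(("ytc_", "ytr_")):
            continue
        # Drop rows where any dimension holds an out-of-codebook value.
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

raw = (
    '[{"id":"ytc_UgzT0bQcXIBIs07L5AB4AaABAg","responsibility":"company",'
    '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]'
)
print(len(parse_batch(raw)))  # 1 valid row
```

Validating before storage matters here because an LLM coder can emit values outside the codebook; silently accepting them would corrupt downstream tallies of the coding dimensions.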