Raw LLM Responses
Inspect the exact model output for any coded comment by looking it up by comment ID.
Random samples:

- "Software developers will move to developing agentic systems. Rather than doing t…" (ytc_Ugw4GMasQ…)
- "Wait so true I'm gonna go borrow some ai slop someone generated. Remember gang, …" (ytc_UgzDAhMAu…)
- "Max, that's a crap. Nobel prize was awarded to the scientist's who for years and…" (ytc_UgwoztqVU…)
- "Given that humanity can't even agree on what consciousness is nor come up with s…" (ytc_Ugwwph1Dv…)
- "Disagree with that framing, because it suggests that the lawyers in this case ar…" (rdc_n9hftrt)
- "And when anthropic does similar shit youre gonna go to google right? And then ba…" (rdc_o7xyiaf)
- "I swear to goodness ai stole some of my groundbreaking personal research. It sou…" (ytc_UgwnHjeMS…)
- "I hate to risk siding with apologism, but tbf it's not a whole lot different tha…" (rdc_f78zjol)
Comment (source: youtube, posted 2025-05-30T12:4…):

> While AI and technology in general has helped us in many ways, in many ways it has also made us less resourceful. Across history, humanity had to think of small ways to assist itself in daily tasks (toothbrush, car, vacuum), but AI is promising to replace what human beings have done. Aside from care aids, I don't see why we need AGI and agents?
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | ban |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugz9RVSE4KnzmGEOoBB4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxdoCbYWBIKCPLMNJp4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwrWLSqX9iONMSIFC54AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgypPjsLLM3_UTey1154AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "ban", "emotion": "resignation"},
  {"id": "ytc_Ugx0EbsDQUT5D7qQ3J54AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugwals5RhybIHB49Ogx4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxurhrF5FdBT3Yx6o94AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugzhr02VUOGmaWZ51md4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwxEOWYjkzgq-9R9894AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugzfl2_S4KdK3Gry-th4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "none", "emotion": "fear"}
]
```
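The lookup-by-comment-ID step above can be sketched as follows: the raw LLM response is a JSON array of coding records, so parsing it and indexing the records by their `id` field gives constant-time lookup of any coded comment. This is a minimal illustration under that assumption, not the tool's actual code; the variable names and the inlined sample records are hypothetical.

```python
import json

# Illustrative raw batch response: a JSON array of coding records,
# mirroring the shape of the "Raw LLM Response" shown above.
raw_response = """
[
  {"id": "ytc_Ugz9RVSE4KnzmGEOoBB4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgypPjsLLM3_UTey1154AaABAg", "responsibility": "distributed",
   "reasoning": "mixed", "policy": "ban", "emotion": "resignation"}
]
"""

def index_codings(raw: str) -> dict:
    """Parse a raw batch response and index each coding record by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codings = index_codings(raw_response)
coding = codings["ytc_UgypPjsLLM3_UTey1154AaABAg"]
print(coding["policy"], coding["emotion"])  # → ban resignation
```

Indexing once and reusing the dictionary is preferable to scanning the array per lookup when many comments are inspected from the same batch.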