Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- `ytr_UgzrFaAjJ…` — "@LumiLupo You know, I used to be sceptical of Mill’s distinction between higher …"
- `ytc_UgxWK_LJS…` — "I've had a Model S since 2017 and have never had FSD. I did upgrade to autopilot…"
- `ytc_UgwJGSgVa…` — "Bernie's right, we are gonna see massive loss of jobs with AI, and I'll start wi…"
- `ytc_UgxUuYIqq…` — "The bad actors we need to worry about in regards to AI are those who are attempt…"
- `rdc_jdkjx8q` — "I’ve been tinkering with similar processes but haven’t dedicated the time and en…"
- `ytc_UgyULTnEz…` — "In another 10 yrs. they’ll be very difficult to tell from real humans, at least …"
- `ytc_UgzqZILb_…` — "It’s crazy listening to this podcast about AI potentially/inevitably eliminating…"
- `ytc_UgyukMN_c…` — "As a game dev, I do really worry if my pure pixel art skills will be marked as \"…"
Comment
My biggest problem with podcasters like Ezra Klein (and his buddy Derek Thompson) is that they're too far down stream from the actual developments in LLM's to be useful guides to their audience in understanding the true implications of the technology. They never ask real, hard-hitting questions about the viability of the technology or what's really going on behind the curtain. Their starting point is wherever the Venture Capitalist, Tech Bro, or Tech Journalist is starting from which in pretty much every case is an already crafted narrative that serves the short term interests of interlocutor in the first two cases and is just 2nd hand reporting of those guys' narratives in the last case. So essentially, a certain story is already set by the time they start asking questions.
youtube · AI Responsibility · 2026-04-21T17:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyNADIdU3CCPvZsyUR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugwq5b6ip3SrUzxqGLV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzmpequDmnr6Wivqwx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwQ2kCCj7542G0O8fZ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwGE0pB20yfj7KK4r54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzjv8jwuFF8ILNO1Rt4AaABAg","responsibility":"none","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxQzxUlBN_U-KcXUlR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgysVrOmoqoms3G0UzR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugx-aphWNapnS9F2dYN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz4eJHbEUtELkFYMVd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"approval"}
]