Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- >If programmers who use AI are more productive enough that it reduces the tot… (rdc_kz0b8l2)
- Say what you want about the disappearance of human jobs. This might well happen … (ytc_UgyEyuDmF…)
- It is rare to find someone with the ambition to develop incredible things who do… (ytc_UgyjSTYom…)
- The trajectory of the industrial revolution will lead to automated everything. W… (ytc_UgxMFBxq5…)
- The new book, The Bubble That Broke The Bank, argues that the greatest paradigm … (ytc_UgzHup6FQ…)
- They told us on all of the cartoons. People thought it was just cartoons. It … (ytc_Ugys_JkZr…)
- also not to mention if ai ever takes jobs uh we can just protest and rebel then … (ytr_UgzSnbUE1…)
- That's what I'm saying. These tech "geniuses" are the ones creating this chaoti… (ytr_Ugx3fFnLi…)
Comment
The irony of early on in the podcast talking about how people don’t admit to not knowing something, to talking about how no one actually understands LLMs, to taking percentage chances of catastrophe seriously. Neither Dario (Anthropic) nor Elon actually know any more than anyone else about chances of catastrophe. They are purely speculating, plain and simple. The correct answer to “what are the chances of AI catastrophe?” is: “I don’t know, and neither does anyone else”
Source: youtube · AI Moral Status · 2025-11-07T04:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzvBR3xx6FYttlkxYt4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwQSTTf7v2pVbA_xQV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz2BPndcHlNRgzcuZl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxfn1OzBs-TIQjcWXJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzHTHA9O3SBJgyUy4J4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx0GDbnzyQM4Vdlb7x4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy7OHLiO9WMY1At9Dp4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzZhAxON4WPIXn7HDR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw_DQynDVHjJvl2m9t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"unclear"},
{"id":"ytc_UgywzpC9VikcdWfMXuV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
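A response like the one above is only usable downstream if every record carries the four coding dimensions with recognized values. Here is a minimal validation sketch in Python; the allowed value sets are inferred from the sample records shown above, and the actual codebook may define additional categories.

```python
import json

# Allowed values per dimension, inferred from the sample response above;
# the real codebook may include categories not seen in this sample.
ALLOWED = {
    "responsibility": {"government", "developer", "user", "distributed", "none"},
    "reasoning": {"consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"fear", "approval", "indifference", "mixed", "unclear"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records.

    A record is kept if it is a dict with an "id" field and every
    coding dimension holds a value from the inferred codebook.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # skip malformed entries
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Example with one well-formed record (hypothetical comment ID):
raw = '[{"id":"ytc_example","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}]'
print(len(validate_response(raw)))  # 1 valid record
```

Records that fail validation are dropped rather than repaired here; in practice one might instead re-prompt the model for the failing comment IDs.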