Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Lol bullshiit there ai Elon you don't factor the automatic we don't think about …" (ytc_UgyknuBmd…)
- "The issue though is AI in its current llm forms Is not profitable. So it really …" (ytc_Ugwa9sHAi…)
- "Well…, Cankles McTacotits done settled the problem for them! Thursday he signed…" (ytc_UgxFjINs1…)
- "@Oliver_Goicov Perhaps, but I don't know if we should default to hobbyist econom…" (ytr_UgzS8Gzg2…)
- "Damn right, we should be concerned about it. The pros are miraculous (curing can…" (ytc_UgwJCEmZT…)
- "Your behavior trains recommendation AIs Every time someone: Watches a YouTube v…" (ytc_UgwuoDkTj…)
- "And if its raining hard and you can't weave? The failure is auto pilot. Its next…" (ytc_UgztUyK3Q…)
- "if ai was ever going to be useful, its ignorance like this situation will end it…" (ytc_UgzutBMFQ…)
Comment
The trouble is that when I listen to Zuck, Altman et al, then I can't escape the feeling that they've swallowed Roko's Basilisk whole with the degree of their core belief that hitting AGI is the most important stage in human development, as though you can somehow skip the "feed ourselves reliably" stage and that taking life-critical resources _away_ from humans is an acceptable strategy for achieving that goal. It's absolute madness, especially when you consider that LLMs for generative AI are an evolutionary dead-end.
youtube · Cross-Cultural · 2025-12-20T00:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugwq9IZ7dcfycp5sB254AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzFkmF1ZneX9Ki4Sgl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxmyHeQQm1aqYKht2l4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz2cxH5sbHjXDiL1EJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugy74opW5ETMbM77c6h4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxV8t5SNhpHksmVmMh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwCj6wi1IflLISbmd54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugy4GIbEr8yMOGJ7k-V4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwJYY7fyv4IbRgDKiR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwDImfl_vAonS9j3gJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
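A raw batch response like the one above can be turned back into the per-comment "Coding Result" view by parsing the JSON and indexing records by comment ID. Below is a minimal sketch, assuming only the record shape visible in this sample (`id` plus the four coding dimensions); the function name `index_codings` and the strict-validation behavior are illustrative choices, not part of the tool shown here.

```python
import json

# A raw batch response in the format shown above, truncated to two
# records for brevity (real batches code ten comments at a time).
raw_response = '''[
  {"id": "ytc_Ugwq9IZ7dcfycp5sB254AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzFkmF1ZneX9Ki4Sgl4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]'''

# Every record must carry the comment ID and all four coding dimensions.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw: str) -> dict:
    """Parse a raw batch response and index records by comment ID,
    rejecting any record that is missing a coding dimension."""
    indexed = {}
    for rec in json.loads(raw):
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} is missing {missing}")
        indexed[rec["id"]] = rec
    return indexed

codings = index_codings(raw_response)
# Look up the coding for one comment by its ID.
print(codings["ytc_UgzFkmF1ZneX9Ki4Sgl4AaABAg"]["emotion"])  # -> fear
```

The indexed form is what a "look up by comment ID" view needs: one dictionary access per query, with malformed records rejected at parse time rather than surfacing later as blank table cells.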