Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking it up by its comment ID or by opening one of the random samples below.
Random samples
| Comment preview | Comment ID |
|---|---|
| @erikengheim1106 Yes yes, you've suckled off of Shad's toes for hours on end, th… | ytr_UgwNwqxt-… |
| Stop faringite AI & start to pray for a way to use the power it gives you !… | ytc_Ugx5fa1In… |
| We’re in a simulation, but AI is a threat to life in 2030 and 2045? Make up your… | ytc_Ugw3rYVal… |
| What happens when these automated trucks have a computer malfunction and become … | ytc_UgyFgNxrr… |
| Once it is produced, it is practically impossible to prove how much is stolen fr… | ytc_Ugx40SQYW… |
| In my opinion human artists put their emotions in the art and ai can't make a pa… | ytc_Ugy8D-E2P… |
| If that’s true, and they didn’t do a replaceText on “OpenAI”, that would be even… | rdc_kcp2ti4 |
| Here is my answer to all you "artists" complaining and bitching🖕 You were the on… | ytc_Ugx8E1fLV… |
Comment
Alex is overcomplicating the "Assume billions of agents come online..." economics because he is basing it all on the underlying LLMs being (or - in the future - needing to be) self-aware models. Once models are truly self aware, well, sure, we debate their rights and hopefully, they do get rights. But for now, any "emergent behavior" you see is randomness mixed with human intervention and scaffolding.
Source: youtube
Posted: 2026-02-10T23:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
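Each coding result is one record from the raw batch response reproduced below. As a minimal sketch of the record shape, here is a hypothetical Python type whose label sets are restricted to the values visible on this page; the coder's real controlled vocabularies may well contain additional values.

```python
from dataclasses import dataclass

# Label sets observed on this page only; the real controlled
# vocabularies used by the coder may be larger.
RESPONSIBILITY = {"developer", "company", "ai_itself", "distributed", "unclear"}
REASONING = {"consequentialist", "deontological", "contractualist", "unclear"}
POLICY = {"regulate", "liability", "industry_self", "ban", "none", "unclear"}
EMOTION = {"outrage", "mixed", "approval", "indifference", "fear"}


@dataclass(frozen=True)
class CodedComment:
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def __post_init__(self) -> None:
        # Reject records whose labels fall outside the observed sets.
        for value, allowed in [
            (self.responsibility, RESPONSIBILITY),
            (self.reasoning, REASONING),
            (self.policy, POLICY),
            (self.emotion, EMOTION),
        ]:
            if value not in allowed:
                raise ValueError(f"unexpected label {value!r} in {self.id}")


# Example: the record from the batch response below whose labels
# match the coding result table above.
rec = CodedComment(
    id="ytc_Ugwz7iNk2pAlvbEqOH94AaABAg",
    responsibility="developer",
    reasoning="consequentialist",
    policy="industry_self",
    emotion="mixed",
)
```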
Raw LLM Response
[
{"id":"ytc_Ugx0D0BqJIoJtPN-bnR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugzc-5WPzZ2MsuWxCwx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugy66fS1HNyCAt1z7IJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzNDYe2N9T01_JR3nx4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxW3CtlfcG03TqcNjl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxPB46nHqsrKhz9Exp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugwz7iNk2pAlvbEqOH94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgyDvYV5JmdWL8oLdJZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwWZqpHvcU6aCVM3zN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyDyfj2iqMlQclHBDd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
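Looking up a comment by ID presumably reduces to parsing a raw batch response like the one above and indexing its records by their `id` field. A minimal sketch of that, assuming the response is a JSON array of flat records; `index_by_id` and the `raw` variable are illustrative names, not the tool's actual API:

```python
import json


def index_by_id(raw: str) -> dict[str, dict]:
    """Parse a raw batch response and index its records by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}


# Example with one record taken from the response above.
raw = '''[
  {"id": "ytc_Ugwz7iNk2pAlvbEqOH94AaABAg",
   "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "industry_self", "emotion": "mixed"}
]'''

coded = index_by_id(raw)
print(coded["ytc_Ugwz7iNk2pAlvbEqOH94AaABAg"]["policy"])  # industry_self
```

Building the index once and serving lookups from a plain dict keeps each ID query O(1) after a single parse, which suits an inspection page that answers many lookups against the same batch file.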