Raw LLM Responses
Inspect the exact model output for any coded comment: look up a comment by its ID, or pick one of the random samples below.
Exactly ! I wish the RoboTaxi would have hit the brakes at an even higher speed…
ytr_UgzxciRVZ…
I initially though using AI would have been alright if it was solely used like a…
ytc_UgzYrgoKD…
IT IS NOT ONLY GIVING A BRAIN TO AN AI, BUT ALSO HAVE TO GIVE A HEART!…
ytc_UgwRCkhUC…
They sell you "AI fear" so you'll beg for the handcuffs. While billionaires infl…
ytc_UgzOT7wU7…
I hate AI almost as much as mass baby murdering.
It is the most retard thing eve…
ytc_Ugy0izYWq…
That's an interesting challenge! Sophia has some unique features, but we love th…
ytr_UgxKVRg2M…
People here are missing the point when big tech companies support this.
They ab…
rdc_f8om0qt
Get ready. It’s inevitable. Build your own communities for food shelter and proc…
ytc_UgwjzFyLI…
Comment
There's a 4th option that AGI isn't cost effective, which makes it impractical. Moore's law has been dead for many years now, with marginal improvements in lithography. We are at tippy top tulip mania with AI everything in the markets. There is an efficiency curve with LLMs that they're already hitting. Better AI at this point means more cost. AI is already competing directly with humans for material and energy costs. New AI also has to be trained, so theoretically, it will only be able to steal jobs for one generation until it hits equilibrium until companies realize that AI can't solve all of their problems, this holds true even if you have an AGI that can train other AI, *you still need a source of the (training) data*.
youtube
2024-12-31T05:4…
♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugy7CiP_WRsuRD7ZY-94AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwurTSRlmR2kASMQvR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugwq7ZioSzGKkmgp-ch4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxzEtFJWgnciAylnpJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwO2wPDMFUtfUDpuNJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyQDEwjiEy6DODa-Xl4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyyPF-RIfFY81eXM4J4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw71YB5J9v6rPG7yf14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxMH_t2KennOBWJx2B4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy1wN4o8KKQMhTzD7B4AaABAg","responsibility":"none","reasoning":"mixed","policy":"ban","emotion":"mixed"}
]
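A raw response like the one above can be parsed and indexed by comment ID to support the lookup shown at the top of this page. A minimal sketch (the two records are copied from the response above; the fields match the coding dimensions in the result table):

```python
import json

# Two entries taken verbatim from the raw LLM response above. The schema
# (responsibility, reasoning, policy, emotion) mirrors the Coding Result table.
raw_response = """[
  {"id": "ytc_UgwurTSRlmR2kASMQvR4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxzEtFJWgnciAylnpJ4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]"""

# Index the parsed records by comment ID for constant-time lookup.
codes_by_id = {rec["id"]: rec for rec in json.loads(raw_response)}

record = codes_by_id["ytc_UgwurTSRlmR2kASMQvR4AaABAg"]
print(record["emotion"])  # fear
```

In practice the same indexing step would run over the full response array, so any coded comment can be retrieved by the ID shown next to its preview.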