Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Musk's claim that in the future you will be able to make money when you aren't u…" (ytc_Ugwj_g_cr…)
- "Lmfaoooo imagine getting big off your boyfriend being famous then crying because…" (ytc_UgxkctZBW…)
- "As if people won’t boycott the use of AI? No one asked for it…there is something…" (ytc_UgwEIjUC9…)
- "Kids falling for DeviantArt's bait smh... You're not going to get rich selling A…" (ytc_UgzISHNYy…)
- "This is big lie, who is AI to decide who Jesus is?, this lady should go and slee…" (ytc_UgwHBrOHO…)
- "Many Ai systems are already fecking up The Ai ordering at drive through a someo…" (ytc_UgxIrzH4x…)
- "Thank you for creating these amazing artworks. They look absolutely amazing. And…" (ytc_Ugy_VhJE7…)
- "If “blue blood” means dedication, effort, sweat and tears with +1000 hours over …" (ytc_Ugyt0nm7R…)
Comment
Joscha has the usual attitude of the dismissive types on this subject. "Empirical Evidence"... like all of the history of species on earth and how smarter, more powerful species regard (or don't) those beneath them? Or do you want an example of an AI trying to kill everyone? Surely an actually intelligent AI would have to have the ability to kill us before it would ever let on that it would do so. This is like the cop waiting for a criminal to point the gun at him before being sufficiently worried about the gun in the perp's hands. Sure it might work out or you could just die. I think what we need instead is a persuasive argument for why this *isn't dangerous* as every analogous example we have seems to indicate these kinds of scenarios are dangerous for the little guy in the scenario.
youtube · AI Governance · 2024-12-30T23:5… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
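These coded dimensions, together with the comment ID, mirror the fields of each object in the raw model response below. A minimal sketch of one record as a Python structure (the class name and the `coded_at` field are illustrative assumptions; the field names and example values come from the table and the raw response):

```python
from dataclasses import dataclass

@dataclass
class CodingResult:
    """One coded comment, mirroring the fields returned by the model.

    Field names match the JSON keys in the raw response; the class name
    and the coded_at field are assumptions for illustration.
    """
    id: str              # e.g. "ytc_UgwN7eC2eFCvuISxmJZ4AaABAg"
    responsibility: str  # e.g. "ai_itself", "developer", "company", "distributed", "none"
    reasoning: str       # e.g. "consequentialist", "deontological", "mixed", "unclear"
    policy: str          # e.g. "regulate", "liability", "none", "unclear"
    emotion: str         # e.g. "fear", "outrage", "approval", "indifference", "mixed"
    coded_at: str = ""   # ISO timestamp, as shown in the table above
```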
Raw LLM Response
```json
[
{"id":"ytc_UgwN7eC2eFCvuISxmJZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx9-IAqCcLRRmnNUNF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzEyaYYGz5Pg1uSGDx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxG_7Xk_CwHCj-jYhd4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyMchnkeWy7cWNpfit4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw2D7nZo-bbaribvnZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugx78KeYMJxxnksiIxh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw215QmMaZlGBRjHGp4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzUjwGHK4_ortttaPl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyWos-0Hxq1uDEcDQl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}
]
```
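The raw response is a JSON array covering a batch of comments; the coding result shown above corresponds to the entry whose `id` matches the inspected comment. A minimal sketch of that lookup (the function name and the example variable are assumptions, not part of any documented tool):

```python
import json


def find_coding(raw_response: str, comment_id: str) -> dict | None:
    """Parse a raw LLM response (a JSON array of coded comments) and
    return the object whose "id" matches the requested comment ID."""
    try:
        entries = json.loads(raw_response)
    except json.JSONDecodeError:
        return None  # the model may occasionally return malformed JSON
    return next((e for e in entries if e.get("id") == comment_id), None)


# Example with the first entry of the response shown above:
raw = ('[{"id":"ytc_UgwN7eC2eFCvuISxmJZ4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
print(find_coding(raw, "ytc_UgwN7eC2eFCvuISxmJZ4AaABAg"))
# -> {'id': 'ytc_UgwN7eC2eFCvuISxmJZ4AaABAg', 'responsibility': 'ai_itself', ...}
```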