Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "Thanks Bernie for a good speech and an important topic. It is very likely that t…" (`ytc_UgyHceP5R…`)
- "Honestly, I don't blame him. The biggest tech companies in the world all say AI …" (`ytr_Ugzt97SPO…`)
- "Founders of Open AI issued a warning. Yet people point out that only humans have…" (`ytc_UgycQu9Gv…`)
- "Ive said it for years but the hype for ai “art” is strongly contributed to by th…" (`ytc_UgzY1flDM…`)
- "AI is starting to tell us to stop, if that in itself isn’t a good reason to stop…" (`ytc_UgzwwQ6M9…`)
- "These guys are just running through the lowest AI same-old same-old arguments we…" (`ytr_UgxhOogp7…`)
- "So long as we have government agencies tasked with spying on and controlling cit…" (`rdc_fvyzdy7`)
- "Going off grid these days means complete isolation, no internet, no phones, no c…" (`ytc_UgzqzFmZ7…`)
Comment

> 13:48 The difference between AI and nuclear weapons, is that anyone in their basement can work on AI development. Not true for nuclear weapons.
>
> Because open source is decentralized (Stable Diffusion is open source) and can be worked on by volunteers from anywhere in the world, it's an inevitability. There's no stopping it. No amount of regulation or bans will stop it.

Source: youtube · Posted: 2024-05-16T19:3… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
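The four dimensions above take values from a closed codebook. As a minimal sketch, the sets below contain only the values observed in this batch's raw response (the full codebook may define more); the `record` dict mirrors the table above:

```python
# Values per dimension, as observed in this batch's raw LLM response.
# Illustrative only -- the full codebook may define additional values.
OBSERVED_VALUES = {
    "responsibility": {"developer", "company", "user", "ai_itself",
                       "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "approval", "mixed"},
}

# The coding result shown in the table above.
record = {
    "responsibility": "distributed",
    "reasoning": "consequentialist",
    "policy": "none",
    "emotion": "resignation",
}

# Check every dimension against its observed value set.
assert all(record[dim] in vals for dim, vals in OBSERVED_VALUES.items())
print("record is valid")  # prints "record is valid"
```

A check like this catches a model that drifts outside the codebook (e.g. inventing a new emotion label) before the record is stored.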
Raw LLM Response
```json
[
  {"id":"ytc_UgyiOM3SZ_5p0gRaWUV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzUmxvQ17p8o2uJSMZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwGtJDDLXtgQPRqloZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgyT7jrUo409g04jzpx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwRAj1Bh7LkXOL4HXB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyMKHZGxn8k2LI47Yd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzu519XUaVjsnIkwBZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UgyD3g2tOz8-BWmK81t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwqYLA5C32NFfw9jER4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxO2ZlFk368oh0Pgep4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
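A raw response like the one above is a JSON array with one record per comment, and the page's lookup-by-comment-ID view implies the records get indexed by `id`. A minimal sketch of that parse-and-index step, assuming Python; `parse_batch` is a hypothetical helper (not part of any shown pipeline), and `RAW_RESPONSE` holds one record copied from the batch above:

```python
import json

# One record copied from the raw LLM response above; a real batch would
# contain the full array.
RAW_RESPONSE = """
[
  {"id": "ytc_UgzUmxvQ17p8o2uJSMZ4AaABAg",
   "responsibility": "distributed", "reasoning": "consequentialist",
   "policy": "none", "emotion": "resignation"}
]
"""

# Fields every record is expected to carry, per the batch above.
EXPECTED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}


def parse_batch(raw: str) -> dict:
    """Parse a raw response and index its records by comment ID."""
    records = json.loads(raw)
    by_id = {}
    for rec in records:
        missing = EXPECTED_FIELDS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} missing {missing}")
        by_id[rec["id"]] = rec
    return by_id


coded = parse_batch(RAW_RESPONSE)
print(coded["ytc_UgzUmxvQ17p8o2uJSMZ4AaABAg"]["emotion"])  # prints "resignation"
```

Indexing by `id` makes the lookup view a single dictionary access, and the missing-field check surfaces malformed model output as a clear error rather than a silent gap in the coded data.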