Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- `ytc_UgxWXmn27…`: "if you make AI it should not think for itself and not being able to take decisio…"
- `ytc_Ugxt_0kgK…`: "True story : ChaGPT couldn't even figure out how to split a <div> in html after …"
- `ytr_UgwAR-x48…`: "Yeah how did they program that? Most automated voice prompts sound so robotic, w…"
- `ytc_UgwfUheqB…`: "If stolen AI art isn't taken into account soon, it could potentially make every …"
- `ytc_Ugw7Lr74g…`: "Thank you man. For telling us about this ai artist surprisingly I haven’t heard …"
- `ytc_UgyDeL9YS…`: "So we're not even gonna talk about the judge's court order that made OpenAI reta…"
- `ytr_UgyZF72sf…`: "It’s so awful it’s almost funny. His opinion on a lot of non-european weapons i…"
- `ytc_Ugy_Pqzmn…`: "Now this man might know much about AI but he knows nothing of the Bible how does…"
Comment

> Preventing the creation of superintelligence will have costs, and _we must pay them._ If Dean Ball thought that his life and the lives of all of his loved ones was at stake in the next 5 years, he would not be making any of these points. He would have an existential crisis and then do everything he can to shut down AI development, _no matter the cost._ To do anything else would be deeply disturbed and not in line with the values of the vast majority of humanity.
>
> The real crux here is that Dean is confused about the nature of the risk, and has epistemics that will prevent him from being significantly less confused until bad things happen in the world that are sufficient to dislodge his emotional state.

youtube · 2025-11-20T22:4… · ♥ 11
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugxhev4NGxygLF8oZMF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxDuXnyJgV4hXBoO4B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwJ_Utk815mESSL_xd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugxf2ysfrcjwOnYW4F54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxmcFJw3kKLiERMxy54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxL0H9m8rS1m5QivgV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgyFFyGyzTCs1cqrp2N4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwMyuR03RQrhVBnhxp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxcR4GjDOFwp7z_kMd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgzG-M5F4kw2zM21MZh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
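The raw response is a JSON array of per-comment codes, one object per comment ID, with the four dimensions shown in the Coding Result table. A minimal sketch of parsing and validating such a payload (the field names come from the response above; the allowed value sets are assumptions inferred from the codes visible on this page, not an authoritative codebook):

```python
import json

# Assumed allowed values per dimension, inferred from the codes seen on this
# page; a real pipeline would load these from its codebook instead.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "government", "user"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "ban", "liability"},
    "emotion": {"indifference", "fear", "outrage", "approval", "mixed"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only well-formed entries."""
    entries = json.loads(raw)
    valid = []
    for entry in entries:
        # Each entry must be an object with a string comment ID.
        if not isinstance(entry, dict) or not isinstance(entry.get("id"), str):
            continue
        # Every dimension must carry one of its allowed values.
        if all(entry.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(entry)
    return valid

raw = ('[{"id":"ytc_Ugxhev4NGxygLF8oZMF4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
print(len(parse_codes(raw)))  # prints 1
```

Dropping malformed entries rather than raising keeps one bad object from discarding an otherwise usable batch; entries that fail validation can be queued for re-coding by ID.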