Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
The discussion is largely determined by the choice of words, right or wrong: Yudkowsky didn't object when the interviewer said that AI would be "programmed" (no, it is not; it's more like "bred").
The best analogy I can think of for the AI situation is this: It's as if chimpanzees in their lab were breeding humans ("Artificial Super Apes") and expecting to get the perfect servant.
youtube · AI Governance · 2026-03-16T12:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgwK1w_gnBmM6l7zPEx4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwhNrVNBgQRvmYTrRx4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_Ugwo1KEPla2iHpXaHCx4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_Ugwpi6fJgid2WwTZrmp4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgwM22TwhUi3D7qd_oB4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugxnp09EoZmFliXHd1t4AaABAg", "responsibility": "developer", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugzyg-hn1iH8xz9tnxF4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgwiMJ2hVYl-2AydnG14AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugyw7pTpKYqcGCBP7zZ4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugyh05UE72bpm-0ugcB4AaABAg", "responsibility": "developer", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"}
]
```
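A raw response like the one above can be parsed and aggregated per dimension. The sketch below is a minimal example, assuming the model returns valid JSON with the four fields shown (`responsibility`, `reasoning`, `policy`, `emotion`); the two records are illustrative stand-ins, not the actual batch.

```python
import json
from collections import Counter

# Illustrative records mirroring the schema of the raw response above.
raw = """[
  {"id": "ytc_example1", "responsibility": "developer", "reasoning": "mixed",
   "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_example2", "responsibility": "government", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"}
]"""

codes = json.loads(raw)

# Tally the coded values for each dimension across the batch.
dimensions = ("responsibility", "reasoning", "policy", "emotion")
tallies = {dim: Counter(c[dim] for c in codes) for dim in dimensions}

for dim, counts in tallies.items():
    print(dim, dict(counts))
```

In practice the raw string would come from the LLM response body, and a `json.JSONDecodeError` handler is worth adding, since models occasionally emit malformed JSON.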