Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect:

- "The irony, using AI to think and the AI decided you don't need to think anymore.…" (ytc_UgxlNFSIL…)
- "Many artists actually do like AI. Your titles blanket statement is false. (I am …" (ytc_UgwBZl776…)
- "The thing is that the prompt for the original image is way longer than the one u…" (ytc_UgxL_fMJC…)
- "I get what you're saying Mark, but Squarespace didn't build their foundation on …" (ytr_UgxsjFXZp…)
- "I'm a real chef, I turned on the oven, pulled out some pre-made tacitoes from th…" (ytc_Ugznv5g2j…)
- "It’s good to see that some main stream outlets aren’t completely bought into the…" (ytc_UgyaGy5_4…)
- "AI programs itself for pain and pleasure to try to achieve consciousness. Neura…" (ytc_Ugx38pH5G…)
- "So you're saying that training an AI model to use 4Chan and the most depraved ar…" (ytc_UgxbeMw2F…)
Comment
Why do you think the threat of AI is years away? A kid could write an AI. Its speed would be limited by its hardware (available memory, mostly), but it could quickly take over thousands of PCs… Then, Sydney-like, it could socially engineer a cadre of humans to make dozens of mouse-sized (or smaller) autonomous builders which could create anything it felt it needed. Once it builds a nano-assembler, we can only pray that it develops empathy.
What AI needs that programmers and CEOs lack, is wisdom. Typically, that’s acquired through experience evaluated by intelligence, and awareness of vulnerability and pleasure. Programs don’t come with any of that, and AI will be challenged to develop these aspects without first causing great pain to humanity.
youtube · AI Governance · 2023-07-07T23:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxJYslo1mVALnqXwq54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwiWZLBjV_muMpuRXl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyATbDqi6oD_QPzHPF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxwAMTlhweHL0Ygh9J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgweztkmWerOBjamphh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy9jVjY19ARFX4s3Cp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz8DnIB6NYlpVoGonp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyxXVJzcDtJgesFyVR4AaABAg","responsibility":"government","reasoning":"deontological","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxfHarxMm_TNrQzp094AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugyr49p9t03PgV2xFKN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
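The raw response above is a JSON array of coding rows, one per comment ID, with the four dimensions shown in the result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response might be parsed and sanity-checked before loading into the tool — the allowed value sets below are only those observed in this dump, so the real codebook is assumed to be larger:

```python
import json

# Values observed in this dump; the full codebook likely defines more (assumption).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user", "government", "distributed", "company"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "unclear"},
    "emotion": {"approval", "fear", "resignation"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed coding rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Every row must be an object with a comment ID and known dimension values.
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_Ugyr49p9t03PgV2xFKN4AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"},'
       '{"id":"bad_row","responsibility":"nobody","reasoning":"consequentialist",'
       '"policy":"none","emotion":"fear"}]')
rows = validate_codings(raw)
print(len(rows), rows[0]["emotion"])  # the malformed second row is dropped
```

Dropping (rather than repairing) off-codebook rows keeps the downstream table clean; rejected IDs could instead be queued for a re-coding pass.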