Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "The fine art world has already separated itself from the common man by creating …" (`ytc_Ugz5gKXD5…`)
- "I don't know Sam, so I don't know if he has a moral compass. I don't know Elo…" (`ytc_UgzKDI6rx…`)
- "Yup. In this metaphor, which I love love love, I'd argue Nvidia is the only one…" (`rdc_moza49m`)
- "AI generation dovetails perfectly with the anti-intellectualism currently animat…" (`ytc_Ugw_TymaV…`)
- "Please stop worshiping the man who runs a 30K per year private school in Palo Al…" (`ytc_UgwqVETZT…`)
- "It's hard for me to believe in my ability to make art because Its just not aomet…" (`ytc_Ugzkk1rV1…`)
- "I was trying to become a data analyst in austin took the classes, played the net…" (`ytc_UgyFnw8WL…`)
- "Will you know, I think that's why I like AI art because it's so boring. I can ta…" (`ytc_UgyMUfH3g…`)
Comment
> Nobody will be slowing down AI development. Only fool can think that.
>
> Historical precedent supports my skepticism: the Manhattan Project accelerated despite international tensions, as did Soviet efforts post-war. Current data shows no signs of slowdown—U.S. AI funding hit $67 billion in 2024, while China’s AI patents grew 200% from 2015-2023. Mutual agreements might be proposed (e.g., via UN talks), but enforcement would be near impossible, and neither side would risk falling behind. The stakes—military applications, surveillance, and economic edge—mirror the nuclear race’s intensity. It’s a fool’s game to think self-regulation will prevail when the incentive is to outpace the other.
youtube · Cross-Cultural · 2025-09-28T05:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |

Coded at: 2026-04-27T06:24:53.388235
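The coded values appear to be drawn from a fixed codebook. As a minimal sketch, assuming the vocabulary for each dimension is exactly the set of labels observed in the raw response below (the real codebook may well define more), the per-comment record could be typed like this:

```python
from typing import Literal, TypedDict

# Label sets below are only those observed in the sample response;
# the actual codebook may define additional values (assumption).
Responsibility = Literal["none", "government", "company", "developer",
                         "ai_itself", "distributed"]
Reasoning = Literal["unclear", "consequentialist", "deontological", "virtue"]
Policy = Literal["none", "unclear"]
Emotion = Literal["indifference", "fear", "resignation", "outrage",
                  "approval", "mixed"]

class CodedComment(TypedDict):
    """One element of the model's JSON array output."""
    id: str                         # comment ID, e.g. "ytc_..." or "rdc_..."
    responsibility: Responsibility
    reasoning: Reasoning
    policy: Policy
    emotion: Emotion
```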
Raw LLM Response
[{"id":"ytc_Ugw-83PPa5NbkcQp5SZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},{"id":"ytc_Ugxm2uBh9Qz8wQTBxPN4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},{"id":"ytc_UgxvszlO-Zfxyz1JQgV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},{"id":"ytc_UgzbrELiswZLzleNTP54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"outrage"},{"id":"ytc_UgxVnp5o1WGhiWuhjbR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},{"id":"ytc_UgwhbhJLiB1p8UrXnOt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},{"id":"ytc_UgyI22sPNiT4GwdfRzd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"indifference"},{"id":"ytc_UgzFG0C8Ky0nEhVKRNJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},{"id":"ytc_UgwOaSHVpxKO_t7g__Z4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},{"id":"ytc_UgyNA-H1oj6SHn3II0Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}]