Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I fully agree that we should not develop a general artificial intelligence or a super-intelligence that we can’t control; it’s better to develop sector-specific, closed AI systems.
In my opinion, you are drifting too far from reality by believing that everything is possible thanks to computers and information networks. The truth is that everything relies on a major vulnerability: all AI technological infrastructures are located on planet Earth and are powered by electricity.
It’s true that we might not be able to “shut anything down” on command, because AIs could become so efficient that they might find ways to stop us and ensure their own survival.
But natural events would be impossible to predict, prevent, or shield infrastructures from. A solar storm, for example—an unusually powerful one—could literally fry every electronic device in the world, from satellites to computers, causing a global blackout. Goodbye computers and AI; we’d be thrown back 200 years.
We have no historical data on extremely powerful solar storms, because we’ve only been using electricity-based technology for a little over a century. In the past, humanity wouldn’t even have noticed—at most, people would have seen a stronger-than-usual aurora borealis, with no major physical consequences.
We also don’t know the effects of a reversal of Earth’s magnetic field on electronic devices. Earth is about 4 billion years old, with cycles so long that we cannot possibly know them, since our species has existed for only about 100,000 years (and has been leaving written records for only about 10,000–20,000 of those). That is a ridiculously small fraction of Earth’s age.
Earth has undergone cyclical mass extinctions, some with unknown causes: meteorites? Volcanic eruptions? Extreme weather phenomena? Earthquakes? These are all events against which AI and its infrastructure are fragile and vulnerable.
What you’re suggesting would only be plausible if everything were located in a permanently stable, calm, and protected place—something impossible on Earth. That’s why I believe long-term predictions are unrealistic.
Source: youtube · Topic: AI Governance · 2025-11-25T12:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyYS6plwNwQ14U6aoB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyyIjmPmMuAw9vjkfF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzykSGMAMh-zmDt-Ex4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx1w07PR1vc3QuIMgl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugw4f_-nGLjD6d1NVJt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwbFbUX7NRzg1W4baR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyUVB1gIuquA_hU1KZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz7SWZ0JIkCYn6n8-h4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzrNX20FtuPRfd7_h94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyHfmoGZ3fQ00ZPXwl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
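The raw response above is a JSON array with one object per comment: an `id` plus the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and sanity-checked before use — the allowed category values below are inferred solely from the sample output shown here and are an assumption, not the tool's actual codebook:

```python
import json

# Assumed category values, inferred from the sample response above;
# the real codebook may define more (or different) labels.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "ban", "liability"},
    "emotion": {"fear", "outrage", "indifference", "resignation", "mixed"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records.

    A record is kept when it is a dict with an "id" field and every
    coding dimension holds one of the allowed values.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Hypothetical record for illustration (the id is made up).
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(len(validate_codes(raw)))  # 1
```

Filtering out malformed records this way lets a coding pipeline skip and re-prompt for individual comments instead of failing on the whole batch.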