Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)
Ok tech bros, let me hit you with this realization:
One day, imagine, I break m…
ytc_UgzTrgqkZ…
You made it seem like we were going to meet the specific AI that got him fired I…
ytc_Ugx6R0G1D…
You are being watched. The government has a secret system — a machine — that spi…
ytc_UgyfuEXjx…
Remember, according to the US copyright office, AI art lacks a proper human hand…
ytc_UgyiUYN5C…
I highly recommend speaking to the AI with dignity, humanity, and respect.
Ther…
ytc_UgzmkPLln…
The content is good depending on the dept of emotion, compassion and enthusiasm …
ytc_Ugzz0U1Md…
I see the peril of AI becoming so tuned to human emotion that it fills huge gaps…
ytc_UgzGY-Hyv…
Just wait until the continent gets industrialized on a massive scale and starts …
rdc_et7macj
Comment
Disinformation: Gen. Powell Anthrax hoax, Tonkin's incident... You need more? Existential threats: climate drift, biodiversity collapse, inequity, energy depletion... You still need more? By the way, maybe AI is part of the solution for finding ways to avoid them. AI is as useful than other tools like quantum computers and maybe it will contribute to calculating climate forecasts on a larger scale, to control plasma in a fusion reactor, to create a new economy that take care of people and nature, like Alphafold was useful for developing a vaccine against COVID. Since Shannon and Wheeler, we know that information is central to understand reality. Those tools are information entities that could act for the benefit of humanity and it should be let in open source. Just imagine, what the world would look like if just one country had owned the nuclear power !
youtube
AI Governance
2023-08-17T23:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxT0jzYgY0XdOQ4cqh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgyEkCQtq92SLKPlPNl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwLVGaFFl8nCHEepqh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx52BnGLYa6UxbMX294AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxAci_nguooo5v0NRB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyILhQ_KsZ-b-C-Lqx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgzDi4kiS-bSe3g-LhN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx0PFkivatSns4E8xd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxedCS7pDsuymN4QxF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgzprZVcmX1iB91yZPp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
```
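A raw response like the one above can be turned into the per-comment coding shown in the "Coding Result" table by parsing the JSON array and indexing it by comment ID. The sketch below is a minimal, hypothetical illustration: the `ALLOWED` sets contain only the dimension values visible on this page, not the tool's full codebook, and `parse_batch` is an assumed helper name, not part of the actual pipeline.

```python
import json

# Dimension values observed in this page's output; the real codebook
# is assumed to be larger -- treat these sets as illustrative only.
ALLOWED = {
    "responsibility": {"none", "distributed", "ai_itself", "company", "developer", "user"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "regulate", "ban", "liability", "industry_self"},
    "emotion": {"indifference", "fear", "outrage", "approval", "mixed", "resignation"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of coded comments) into
    a dict keyed by comment ID, validating each coded dimension."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim} value {rec.get(dim)!r}")
        coded[rec["id"]] = rec
    return coded

# One record from the batch above, used as a lookup-by-ID example.
raw = ('[{"id":"ytc_UgzDi4kiS-bSe3g-LhN4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"resignation"}]')
coded = parse_batch(raw)
print(coded["ytc_UgzDi4kiS-bSe3g-LhN4AaABAg"]["emotion"])  # resignation
```

Keying by comment ID is what makes the "Look up by comment ID" feature cheap: each coded record is retrieved in constant time, and a validation failure points at the exact ID the model coded inconsistently.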