Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
@halnineooo136
Yes, but planes crashed first, rules were made as a response. There are obvious things to regulate, but the whole operation is far too complex to regulate in advance.
AI will have to be the same, because we have only vague ideas about what can go wrong. The best we can do is to fail fast and often, while the potential damage is limited. And researchers test rogue AI in labs to reduce the risk further.
Once we have a clearer idea, we can start to regulate.
In case of AI, we don't have such obvious things as with planes. We already understood how machines work and break down, but AI is an entirely new thing. What theories we had were mostly proven wrong by the first iteration of ChatGPT.
Source: youtube · AI Governance · 2025-09-14T18:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytr_Ugx-qzznYwo1reEFsad4AaABAg.AMA8-Pk0B_PAMSIEx5jYF7","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugyalir_2NorClRhmXx4AaABAg.AMA2yQD-zr-AMAN33w51l2","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugyalir_2NorClRhmXx4AaABAg.AMA2yQD-zr-AMCC2_IYVmC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugyalir_2NorClRhmXx4AaABAg.AMA2yQD-zr-AMCR6jqsD9G","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytr_Ugyalir_2NorClRhmXx4AaABAg.AMA2yQD-zr-AMCWnuuATiY","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugz5jNs56uezLOZZwbV4AaABAg.AMA1IuTRIZAAMJNat_ztFI","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgzOrZwZQyjwQTMXmBh4AaABAg.AMA1-19-Qz_AMGbw0V_JDN","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytr_Ugwim9XQC9rU_cnMzhN4AaABAg.AM9y2mqUfvFAMB9yJeDbRx","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgyO6Ytj4-Ipljm9bO54AaABAg.AM9q5Q9W9e7AMSKFocRrSv","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgyO6Ytj4-Ipljm9bO54AaABAg.AM9q5Q9W9e7AN3QkA82btd","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}
]
```
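The raw response above is a JSON array of per-comment coding records, one object per comment ID. A minimal sketch of how such a batch could be parsed and looked up by comment ID (the `index_codings` helper is hypothetical, not part of the tool; the two records are copied from the response above):

```python
import json

# Raw model output: a JSON array of per-comment codings,
# using the schema shown in the response above.
raw = '''[
  {"id":"ytr_Ugx-qzznYwo1reEFsad4AaABAg.AMA8-Pk0B_PAMSIEx5jYF7",
   "responsibility":"ai_itself","reasoning":"consequentialist",
   "policy":"none","emotion":"approval"},
  {"id":"ytr_UgyO6Ytj4-Ipljm9bO54AaABAg.AM9q5Q9W9e7AN3QkA82btd",
   "responsibility":"distributed","reasoning":"consequentialist",
   "policy":"regulate","emotion":"resignation"}
]'''

def index_codings(raw_json: str) -> dict:
    """Parse a batch coding response and index the records by comment id."""
    records = json.loads(raw_json)
    return {rec["id"]: rec for rec in records}

codings = index_codings(raw)
# The second record matches the "Coding Result" table above:
print(codings["ytr_UgyO6Ytj4-Ipljm9bO54AaABAg.AM9q5Q9W9e7AN3QkA82btd"]["policy"])
# prints: regulate
```

Indexing by `id` makes the lookup for any coded comment a single dictionary access, which is all the "inspect the exact model output" view needs.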