Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_UgwpHqqwB…: "...welp, this guy managed to accomplish what literally no one else could: he ha…"
- ytc_UgzRGGsIl…: "His last name is "O'Connor" and he's interrogating AI......this is scarily close…"
- ytc_UgxXBcwzx…: "AI should be taxed 90% of "released salaries", and those money must go to former…"
- ytc_UgwG-hi7b…: "Writing a book is a solitary endeavor, full of starts and stops that come at a s…"
- rdc_mupxi2b: "There are a lot of subtle things in this clip that give away the fact that it is…"
- ytc_UgzPqGL0G…: "You should read marxists literature about Capitalism,Socialism,and Communism.And…"
- ytc_UgxEuUPMg…: "Physisist: "LLM is good at translations" / Translator "LLM is good at physics" / L…"
- ytc_UgwSw3X9w…: "i hate NFTs. but i will say AI art is not art... yet. / you can bet your ass in 1…"
Comment
@PunmasterSTP They understand it.
It is a true fact that if you define _any_ remotely reasonable metric for credibility on AI and its risks, and rank-order everyone (or just a subset of experts) from most to least credible on that metric, the people at the top tend to be much more concerned about the risk of extinction from AI than the people in the middle and at the bottom.
That doesn't prove they are correct, but it does provide extraordinary evidence that this is worth taking seriously. Given that it is worth taking seriously, we must exercise appropriate caution.
Your only possible argument against this is that expertise doesn't exist, and that your ignorance is just as good as their knowledge.
youtube · AI Governance · 2026-02-24T19:2… · ♥ 6
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytr_UgzlL3VQZqpQHX0KRBN4AaABAg.ATc6MbGEhfUATcFkO-f8na","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgzlL3VQZqpQHX0KRBN4AaABAg.ATc6MbGEhfUATcHIXqReRy","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgzlL3VQZqpQHX0KRBN4AaABAg.ATc6MbGEhfUATd5xPDxgDs","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgxpFXRD06y8kafYW1N4AaABAg.AKnlCiBQ7hcASmD7nFxgiB","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgwdAYy5Ns-SzdHenYd4AaABAg.A60RqRl3sX7A6rah0nUAw8","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgxFVviel1wJXs33FvJ4AaABAg.A0TSqrhPQ58A0U9kW6MVrm","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugy92tQLLIOAxDNVX594AaABAg.9rC8RuKMuxj9rCQ8kFNiPR","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgxVNcXiJJCbY5D1rgp4AaABAg.9d8vhH8q0Pu9d9HHOzgveN","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytr_UgyOljuJ60s6eyz-68l4AaABAg.9u-UhZzneex9vVAXmQ4YVE","responsibility":"none","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgxlNJt3x6bxTbJmThZ4AaABAg.A3X3cI5t6t2A3r-r8sMQ83","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
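The raw response above is a JSON array of per-comment codings, one object per comment ID, with the four dimensions shown in the coding table. A minimal sketch of parsing such a response and indexing it for lookup by comment ID might look like the following. The field names come from the response itself; the allowed-value sets in `SCHEMA` are an assumption, covering only the labels that appear in this sample, so a real label set would likely be larger.

```python
import json

# Dimensions and labels observed in the sample response above.
# Assumption: the real codebook may define more labels per dimension.
SCHEMA = {
    "responsibility": {"none", "ai_itself"},
    "reasoning": {"consequentialist", "virtue", "unclear"},
    "policy": {"unclear"},
    "emotion": {"indifference", "mixed", "fear", "resignation"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw batch response and index codings by comment ID.

    Raises ValueError if a row is missing a dimension or uses a
    label outside the (assumed) SCHEMA.
    """
    coded = {}
    for row in json.loads(raw):
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(
                    f"{row.get('id')}: unexpected {dim}={row.get(dim)!r}"
                )
        coded[row["id"]] = {dim: row[dim] for dim in SCHEMA}
    return coded
```

With the result keyed by ID, the "look up by comment ID" view is a dictionary access, e.g. `coded["ytr_Ugy92tQLLIOAxDNVX594AaABAg.9rC8RuKMuxj9rCQ8kFNiPR"]["emotion"]` would return `"fear"` for the sample above.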