Raw LLM Responses
Inspect the exact model output for any coded comment; look up by comment ID.
Random samples

- "Ai will not replace programmers lol you need to correctly create the logic from …" (`ytr_UgzfKh6K0…`)
- "This is basically the same thing running chat support. How often does the chat b…" (`ytc_Ugx1auMf-…`)
- "I get that Ai generated stuff is sometime favoured cause ppl dont wanna pay for …" (`ytc_UgzyT83pi…`)
- "AI still isn’t capable of doing many of the things that humans can do. A ChatGPT…" (`ytc_UgyoH8jH8…`)
- "What gives any human the right to say humans are actually conscious and not just…" (`ytc_UgxURFi6_…`)
- "So what I'm hearing is don't need to fix the AI we need to fix bias opinions and…" (`ytc_UgxtAHbq7…`)
- "And this problem goes deeper than just the internet. AI is now being used to wri…" (`ytc_UgxCAEIMu…`)
- "It's just such lazy reporting to use the term \"AI\". What makes one technology A…" (`ytc_Ugz1X3BOQ…`)
Comment
@d_trich Do you think the three most cited living computer scientists are pulling one over on you? Or that half of literally thousands of surveyed published AI researchers were lying when they gave a significant chance of human extinction from AI? Or that the majority of leading AI experts were in a conspiracy when they said the same? That academics and industry insiders alike are somehow benefiting from saying that creating something much smarter than humans without being able to control it is immensely dangerous?
There are a few experts who think there is little to no chance that AI poses an existential risk to humanity, but without exception their arguments do not stand up to reasonable scrutiny. If you disagree and think that their arguments are compelling, fine. But you absolutely do not get to throw out AI existential risk as some kind of scam or delusion, when the majority of the field considers it to be a very real thing.
youtube · AI Governance · 2025-12-07T03:4… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytr_UgzrdzLzWdUu0SyAkG94AaABAg.AQ-iKiKKvG7AQ1242P8Ubo","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytr_UgwnjB-9GKL-THzwuVx4AaABAg.AQ-hkKQMu2bAQ-maYjXzR4","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxPo0hIRTQ921Jnled4AaABAg.AQ-hOTln8GKAQ-ifM7KSU1","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytr_UgxPo0hIRTQ921Jnled4AaABAg.AQ-hOTln8GKAQ-lQex1xxT","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxPo0hIRTQ921Jnled4AaABAg.AQ-hOTln8GKAQ-pyWkj8l-","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytr_UgynTMM0QoDUhnl1uT54AaABAg.AQ-goW8Rr3LAQQ9ksJHJNe","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytr_UgwXNLVUBTKgKhC_aSF4AaABAg.AQ-gAzOKp-MAQ0kt20IMMQ","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgwXNLVUBTKgKhC_aSF4AaABAg.AQ-gAzOKp-MAQ2zM2pE63B","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgwXNLVUBTKgKhC_aSF4AaABAg.AQ-gAzOKp-MAQ4FThZzjeU","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgwXNLVUBTKgKhC_aSF4AaABAg.AQ-gAzOKp-MAQ4Q1MMssyw","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}
]
```
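A raw batch like the one above can be validated before it is loaded into the coding table. Below is a minimal sketch; the allowed value sets are inferred only from the values visible in this sample and the Coding Result table, so the real codebook may define additional categories, and `parse_llm_response` is a hypothetical helper, not part of the tool shown here.

```python
import json

# Value sets inferred from the sample output above (assumption: the
# actual codebook may allow more categories per dimension).
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself"},
    "reasoning": {"mixed", "consequentialist"},
    "policy": {"none", "regulate"},
    "emotion": {"outrage", "indifference", "mixed",
                "resignation", "fear", "approval"},
}

def parse_llm_response(raw: str) -> list[dict]:
    """Parse one raw LLM response, keeping only well-formed codings."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue  # drop entries with no comment ID
        # Every dimension must be present with a known value.
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytr_example","responsibility":"none",'
       '"reasoning":"mixed","policy":"none","emotion":"outrage"}]')
print(len(parse_llm_response(raw)))  # → 1
```

Rejecting unknown values rather than coercing them makes model drift visible: if a future prompt yields a new emotion label, those rows surface as failures instead of silently entering the coded dataset.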