Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
Random samples
- "AI is exciting! With platforms like Fanvue, creators can really take advantage o…" (ytc_UgxN-gf7V…)
- "Honestly, I do not care. As a programmer, I am perfectly aware of how much AI su…" (ytc_UgzTE3Xjl…)
- "your distinctions don't matter, AI replacing artists is gana happen and that's G…" (ytc_UgzoNGXnp…)
- "There's a solution here / Treat AI and robots like people not property / If you trea…" (ytc_UgwC7--0K…)
- "What is happening is that everything is turning to shit. I am using AI and it ca…" (ytc_UgztxbGNA…)
- "Deepfake has been around for quite some time now with "tom cruise" romancjng "pa…" (ytc_UgxPK9tSv…)
- "I have total of less than ten chats with Ai. Going late into the game. I found m…" (ytc_Ugz81sxA5…)
- "We all need Gods love ❤️ & compassion Elon & give our time & love to others. Rep…" (ytc_Ugy2zG7dN…)
Comment
@Alverin Your description of a superintelligence is that of a wrathful god, the rest of us in the real world are discussing computer science. Yudkowsky is assigning terms like intelligence and natural selection pressure to a computer algorithm. That's why his ideas shouldn't be taken seriously.
Are there dangers with the modern AI tools? Absolutely. But those are _human_ dangers, created by human intervention, avarice and apathy. They are not AI problems. This mystical digital voodoo apocalypse that Yudkowsky propagates is science fiction dependent on machine sentience and human ignorance to a comically absurd degree. Which makes sense, he's a sci-fi writer. But just like in Harry Potter in The Methods of Rationality, Elizier is imagining magical systems to fix problems that don't actually exist outside of his tech future headcanon.
youtube · AI Governance · 2025-10-16T10:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytr_Ugz0v6HzYZMQayCzDdJ4AaABAg.AOIaMMKMUroAOIvBr1gKYu","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytr_UgyN5VHEBtpWUt9kowd4AaABAg.AOIaBfJkydrAOKzBOyYZpf","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_UgyD0BTu0ysPX4hosyp4AaABAg.AOI_xX57625AOJ2dCpeVVg","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgyD0BTu0ysPX4hosyp4AaABAg.AOI_xX57625AOJVBWaPloW","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_UgyD0BTu0ysPX4hosyp4AaABAg.AOI_xX57625AOJVt3o4xYQ","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgyD0BTu0ysPX4hosyp4AaABAg.AOI_xX57625AOJXfDbzdNg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytr_UgwgVNJgSLMJDLUBU8R4AaABAg.AOI_Yt0P5YPAOL99fCzfXo","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgynkSjGpEQy8-Kc8zh4AaABAg.AOI_-OEKYN6AOIgq3rD1ky","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugz-4krbJQUYK77HCYJ4AaABAg.AOIZrU7CBykAOIfV8nIKFU","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytr_Ugz-4krbJQUYK77HCYJ4AaABAg.AOIZrU7CBykAOIp8lcsbHv","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
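The dimensions in the Coding Result table above (responsibility, reasoning, policy, emotion) come directly from records like these. As a minimal sketch, assuming only the schema visible in the raw response, the array can be parsed and indexed by comment ID for lookup; the `index_codes` helper and the truncated two-record sample are illustrative, not part of the actual pipeline:

```python
import json

# Two records copied from the raw LLM response above, for illustration.
raw_response = '''
[
 {"id":"ytr_Ugz0v6HzYZMQayCzDdJ4AaABAg.AOIaMMKMUroAOIvBr1gKYu","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
 {"id":"ytr_UgyN5VHEBtpWUt9kowd4AaABAg.AOIaBfJkydrAOKzBOyYZpf","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
'''

# The four coding dimensions shown in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw: str) -> dict[str, dict[str, str]]:
    """Parse the model's JSON array and key each record by comment ID."""
    records = json.loads(raw)
    return {
        rec["id"]: {d: rec.get(d, "unclear") for d in DIMENSIONS}
        for rec in records
    }

codes = index_codes(raw_response)
print(codes["ytr_UgyN5VHEBtpWUt9kowd4AaABAg.AOIaBfJkydrAOKzBOyYZpf"]["reasoning"])
# → deontological
```

Defaulting a missing dimension to "unclear" mirrors the value the coder already emits when it cannot decide, so partial records do not break the lookup.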