Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
You haven't seen the latest feats of E Musk's Optimus robot? Me too …
ytc_UgzFrLNT6…
I think there will be some sort of resistance. Everyone I know just about hates …
ytc_UgxaB1BQr…
Get a good Trade down longevity will be better than wasting money on college , w…
ytc_Ugz5VENrw…
I'm not entirely against AI Art, but the ethical and copyright concerns are legi…
ytc_UgxEB5voX…
CAN WE SAY, “THE RISE OF THE MACHINES”? CAN WE ALL SAY “THE TERMINATOR?”
THEY W…
ytc_Ugxx-VX5k…
AI is legalized theft of intellectual property on an unlimited scale. Every huma…
ytc_Ugzc67X5O…
Imagine the aviation industry saying: "Our new AI pilots will fly safely for 99 …
ytc_UgxgvfDpa…
Reporter asks why octogenarian politicians aren’t ready to regulate AI. Yes, the…
ytc_Ugwi8KGmE…
Comment
So...my guy needs to read more SF, he's clearly unaware of Iain Banks' Culture and other books featuring hyperintelligent AI. We DO speculate about these things, he just hasn't read it.
The fact he quotes Ray Kurzweil as an authority weakens his argument significantly - Kurzweil is...deeply unpersuasive.
There is a FAR MORE LIKELY outcome, which is that AI will get increasingly convincing but continue to fail in fundamental ways...and people will embrace it anyway because it *mostly works* for them. If the rate of AI failure can be reduced below human failure, that might be acceptable. But my understanding of current AI technology is that it is purely imitative with no initiative, which is a problem when competing with humans in most fields.
youtube · AI Governance · 2025-09-10T15:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
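A coded row is only usable if every dimension holds a value from the codebook. Below is a minimal validation sketch; the allowed value sets are inferred solely from the codes that appear on this page (the full codebook may define more), and `invalid_fields` is a hypothetical helper, not part of the dashboard's actual code.

```python
# Allowed values per dimension, inferred from the coded examples shown here.
# NOTE: assumption — the real codebook may contain additional categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "mixed"},
    "policy": {"unclear", "none", "ban", "regulate"},
    "emotion": {"indifference", "fear", "approval", "sadness"},
}

def invalid_fields(row):
    """Return the names of coded dimensions whose value falls outside ALLOWED."""
    return [dim for dim, values in ALLOWED.items() if row.get(dim) not in values]

# The coding result from the table above passes cleanly:
row = {"responsibility": "none", "reasoning": "mixed",
       "policy": "unclear", "emotion": "indifference"}
print(invalid_fields(row))  # → []
```

Running the check before rendering the table would surface any row where the model drifted outside the coding scheme.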
Raw LLM Response
```json
[
  {"id":"ytc_UgyxnlP1g7RutRjBpBp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugwmsv1C98Zwl1NQ2xt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyqJECCeAjPwCYrMCF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgyUAOUVX5LOH9MRUxd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"sadness"},
  {"id":"ytc_UgxzVbZVPZ2rUN3EbfN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwbEZC7hRY7b2XZqqV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyc-Zt6vnV0pqLb0JJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwPnV-Re3hMaEVb-814AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugy16-uFptPyMH9PHWx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgxJtgeVoQtSW1VQ99x4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"}
]
```
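The "look up by comment ID" view resolves a single coded comment from a raw model response like the one above. A minimal sketch of that lookup, assuming the response parses as a JSON array of row objects (`lookup` is a hypothetical helper; the IDs are taken from the response shown):

```python
import json

# Two rows copied from the raw LLM response above, standing in for the full array.
RAW_RESPONSE = """
[
  {"id":"ytc_UgwPnV-Re3hMaEVb-814AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugyc-Zt6vnV0pqLb0JJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"fear"}
]
"""

def lookup(raw, comment_id):
    """Parse the model output and return the coding row for one comment ID (or None)."""
    rows = json.loads(raw)
    return next((row for row in rows if row.get("id") == comment_id), None)

row = lookup(RAW_RESPONSE, "ytc_Ugyc-Zt6vnV0pqLb0JJ4AaABAg")
print(row["policy"], row["emotion"])  # → ban fear
```

Returning `None` for an unknown ID (rather than raising) lets the inspector distinguish "comment not coded in this batch" from a malformed response, which `json.loads` would flag with an exception.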