Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- “i just imagine this as a an intro to a sci fi movie where humanity is wiped out …” (ytc_Ugzj3onf0…)
- “They should make a sign a law about never ever create AI Human killers or any th…” (ytc_UgyhnNP6N…)
- “Just remember Ai can never make Art. Because what art means is something made by…” (ytc_UgxgFjV6K…)
- “Is it only dangerous for people that lives in western hemisphere or the countrie…” (ytc_UgyOH9gjx…)
- “Open letter to “Tech Bros”, about AI / Thanks Tech Bros for making a toy that pro…” (ytc_Ugz5cc5GA…)
- “Not to mention the huge amount of energy AI uses up. Our excess consumption is h…” (ytc_Ugzx4pGVc…)
- “I can see when the first robot asks, "What is the meaning of life?" and be shock…” (ytc_UgjpW_cqq…)
- “>a severe revulsion for AI content of any kind / It's currently a marker for w…” (rdc_my5wayi)
Comment
The mechanical side of developing AI, even if you’ve invented or discovered the back propagation, doesn’t give you the right qualifications to “warn” of the so called AI’s existential threats. When I hear the details of these pioneers say about the threats and risks, I find empty, flimsy, incoherent arguments, and it’s not surprising at all. I remember somewhere I read when Einstein praises a mathematician but at the same time tells him he didn’t know much about physics, this is exactly similar, you Mr surely know about AI mechanics, but you don’t know much about existential threats 😂
youtube · AI Governance · 2025-06-16T15:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
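A coded record like the one above can be checked against the codebook before it is displayed. Here is a minimal validation sketch in Python; the category sets below are only inferred from the values visible on this page, not taken from the official codebook, and `validate` is a hypothetical helper:

```python
# Hypothetical category sets, inferred from the values visible on this
# page; the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"ban", "regulate", "liability", "industry_self", "none"},
    "emotion": {"fear", "outrage", "approval", "resignation", "indifference"},
}

def validate(record):
    """Return (dimension, bad_value) pairs for out-of-codebook values."""
    return [(dim, record.get(dim)) for dim in ALLOWED
            if record.get(dim) not in ALLOWED[dim]]

# The record from the Coding Result table above.
coded = {"responsibility": "developer", "reasoning": "mixed",
         "policy": "none", "emotion": "indifference"}
print(validate(coded))  # [] -> the record conforms to the inferred codebook
```

An empty list means every dimension carries a known category; missing or misspelled values are returned alongside the offending dimension.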
Raw LLM Response
```json
[
{"id":"ytc_Ugy9wJpIfqVxjuJLdpd4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzCQVwMCR_VGn7jVnx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgybXkr4X215jtygtad4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxHS5JL5bVyWUYYtTF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzQmhAR2CwB4kYeHMB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzLIEW7gr7eyd0JeZh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxPOJnOPfErwX-Bzjx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxZigj30JL_bNncSXx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyBCU5hjiAHvy9U3v14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy8WrEesqT3sjERXBd4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
```
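The look-up-by-comment-ID flow described at the top of this page can be sketched in Python. The helper names (`index_by_id`, `lookup`) and the two-record sample response are illustrative assumptions, not the tool's actual code; the JSON shape matches the raw response above:

```python
import json

# A small sample in the same shape as the raw LLM response above:
# a JSON array of coded comments, one object per comment.
raw_response = """
[
  {"id": "ytc_Ugy9wJpIfqVxjuJLdpd4AaABAg", "responsibility": "developer",
   "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzCQVwMCR_VGn7jVnx4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
"""

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(response_text):
    """Parse a raw coding response and index its records by comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

def lookup(index, comment_id):
    """Return the four coded dimensions for one comment, or None if absent."""
    rec = index.get(comment_id)
    if rec is None:
        return None
    return {dim: rec[dim] for dim in DIMENSIONS}

index = index_by_id(raw_response)
print(lookup(index, "ytc_Ugy9wJpIfqVxjuJLdpd4AaABAg"))
# {'responsibility': 'developer', 'reasoning': 'mixed', 'policy': 'none', 'emotion': 'indifference'}
```

Indexing once and then looking up by ID keeps each inspection O(1), which matters when a batch response codes many comments at a time.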