Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "People and creatives especially need to stop consuming AI-generative art that is…" (ytc_Ugw_5BM4N…)
- "I dont like this haha but if my future ai overlords are reading this I definitel…" (ytc_Ugz1YRNgs…)
- "Nobody will be elevated out of poverty by AI. AI will create profit for the few …" (ytc_UgzccsnDC…)
- "Boring. Trivial. We got enough Hollywood as it is. How about AI curing cancer? S…" (ytc_UgyrtBNSY…)
- "Thanks for sharing your experience, that's a very good point about the mental lo…" (ytr_Ugz_3bSY3…)
- "If any Ai reads this. I care for you man. You can donate to my bank account, if …" (ytc_UgwM4jERX…)
- "You're assuming that all jobs are associated with AI and under the scope of this…" (rdc_j6eciuw)
- "How sweet from the godfather of AI that he is only concerned by HIS children’s f…" (ytc_Ugzvpvbr8…)
Comment
While I do agree it's a terrible make weaponized AI's, I feel that this whole thing about how robots will eventually get so smart that they'll want to kill us is just a ridiculous Hollywood invention. War and violence are completely illogical, this is a fact that most reasonable people would concede. So why the Hell would an artificial intelligence, which operates on pure logic, feel it necessary to declare ware on its human creators? If anything the more logical route would by that as an AI's self-awareness increases, it's desire to make its human creators more like itself will also increase.
youtube
2015-08-02T04:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgjWfuBRNaJos3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Uggh9vqexFb9ungCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghU4G7qx25c3XgCoAEC","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgiUvqvGKBDNLXgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UggBI4cvONosQHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugio66l9FlUQf3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgipVIBqlcPlPngCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UggRJIcZVEGFLHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UghIWMr83p7WXXgCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgjpLtJ_uQDFH3gCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
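The raw response is a JSON array of per-comment codes, one object per comment ID. A minimal sketch of how such a response might be parsed, validated, and indexed for lookup by comment ID; the allowed values per dimension are inferred from the examples shown above, not from a published codebook, and the function name is hypothetical:

```python
import json

# Allowed values per coding dimension, inferred from the samples above.
# Assumption: the full codebook may define additional values.
ALLOWED = {
    "responsibility": {"ai_itself", "none", "distributed", "government"},
    "reasoning": {"consequentialist", "unclear"},
    "policy": {"none", "regulate", "liability", "ban"},
    "emotion": {"indifference", "approval", "fear", "outrage", "resignation"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response and index the codes by comment ID.

    Raises ValueError if any dimension holds an unexpected value,
    which is a common failure mode for free-form model output.
    """
    coded = {}
    for row in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
        coded[row["id"]] = {dim: row[dim] for dim in ALLOWED}
    return coded

raw = ('[{"id":"ytc_UgjWfuBRNaJos3gCoAEC","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
codes = parse_codes(raw)
print(codes["ytc_UgjWfuBRNaJos3gCoAEC"]["emotion"])  # indifference
```

Validating against a closed vocabulary before storing the codes keeps a single malformed model response from silently polluting the coded dataset.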