Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
It'll only be a problem if this hypothetic future AI considers it as such... We like imagining robots with emotions but don't even know what kind of emotions they will feel, or in what way. In their point of view ultimate happiness might be being launched into the sun, or just to smash bananas on the ground forever - we have to way of predicting that sort of stuff. Since they don't evolve at all, they're created, and we tend to theorize an AI truly becomes an AI when it can reprogram itself, the chances it will mimic us silly humans seems rather low. Any way, I don't think any creature that just became sentient would be content just sitting in a factory, a mine or a battlefield..
| Field | Value |
|---|---|
| Platform | youtube |
| Title | AI Moral Status |
| Posted at | 2019-08-27T14:1… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
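Each coded comment receives one value per dimension. A minimal validation sketch in Python, using only the category values that actually appear on this page (the full codebook may define additional categories):

```python
# Observed values per dimension, taken from the codings shown on this
# page; this is an assumption, not the complete codebook.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "developer", "government", "none"},
    "reasoning": {"consequentialist", "virtue", "unclear"},
    "policy": {"none", "industry_self", "regulate", "unclear"},
    "emotion": {"resignation", "indifference", "fear", "approval"},
}

def validate_coding(coding: dict) -> list:
    """Return a list of problems; an empty list means the coding looks valid."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = coding.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

example = {"responsibility": "ai_itself", "reasoning": "consequentialist",
           "policy": "none", "emotion": "indifference"}
print(validate_coding(example))  # []
```

A check like this catches LLM outputs that drift outside the expected label set before they reach the results table.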
Raw LLM Response
```json
[
{"id":"ytr_UgxwpEk9vKe9NUA-ZQp4AaABAg.904DGqcR7LL94bR4GSfotO","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgwS-unypey-mQW8Tfh4AaABAg.9-O9fftlkQF94q4xyHR9h_","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytr_UgzGEUJLGlkIL-s04Tt4AaABAg.8zUjiWrAmFM96sYJsdnVXU","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugz52rg38UD6qhuUrCF4AaABAg.8zTbbRAPa578zTmfp8-h83","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_Ugz52rg38UD6qhuUrCF4AaABAg.8zTbbRAPa578za4jR2mq-X","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytr_Ugz52rg38UD6qhuUrCF4AaABAg.8zTbbRAPa578za9LT-CDfV","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugz52rg38UD6qhuUrCF4AaABAg.8zTbbRAPa578zcxRScYP3P","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytr_UgxlLv0rMuNplN-nVHR4AaABAg.8zT_Xegkn6U8zTnEN94W_S","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxXyh-p6pDPGItPDER4AaABAg.8zRQsIFYrAU8zTwXqfTc5K","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytr_Ugy4JPgBw-FkT83RQRp4AaABAg.8z1OytC5V9C8z8PSKOgGak","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
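The raw response is a JSON array with one object per coded comment. A minimal sketch of parsing it and looking up a single coding by comment ID (assuming Python; the field names come from the response above, and only two rows are reproduced here for brevity):

```python
import json

# Two rows copied verbatim from the raw LLM response above.
raw_response = """[
  {"id": "ytr_UgxwpEk9vKe9NUA-ZQp4AaABAg.904DGqcR7LL94bR4GSfotO",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "none", "emotion": "resignation"},
  {"id": "ytr_UgwS-unypey-mQW8Tfh4AaABAg.9-O9fftlkQF94q4xyHR9h_",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "industry_self", "emotion": "indifference"}
]"""

def parse_codings(text: str) -> dict:
    """Parse the model output into a dict keyed by comment ID."""
    return {row["id"]: row for row in json.loads(text)}

codings = parse_codings(raw_response)
coding = codings["ytr_UgwS-unypey-mQW8Tfh4AaABAg.9-O9fftlkQF94q4xyHR9h_"]
print(coding["responsibility"])  # company
```

Keying by ID makes the "inspect the exact model output for any coded comment" lookup a single dictionary access.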