Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- `ytr_Ugy-LA-03…` · "AI doesn't need to be sentient or think it is sentient, the madness of creating …"
- `ytc_Ugz6xPaIr…` · "In summary; AI has a ton of risks and it scares us. Our solution is to invest ev…"
- `ytc_UgyEIzT-a…` · "But we creat them, we make them how we want. They do not create themselfs, ai ma…"
- `ytc_Ugz4cBJYa…` · "22:30 She is going too far, she is not taking into account other factors that le…"
- `ytc_UgzN_7gm4…` · "maybe AI will have a bank account and buy goods and services made by AI companie…"
- `rdc_nlzdq1b` · "I use Copilot in IntelliJ and haven't had any of these problems. The auto comple…"
- `rdc_ig291ti` · "They just think our lives are easier because the U.S. government got GPS satelli…"
- `ytr_Ugwm6AHpA…` · "No because humans can admit when they're wrong or don't know something, an LLM s…"
Comment

> AI is literally writing its own code, rules and laws that is how it works and this is why Humans do not understand what it is doing. AI is already smarter than the majority of humans, within 12 months it will be smarter than all humans combined.

> I think the guys working on AI though are a little ahead of where your IT skills got to

Platform: youtube · Video: AI Moral Status · Posted: 2025-06-09T12:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
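Each coding result carries the same four dimensions shown in the table. A minimal sketch of that record as a typed structure; the class name is illustrative, and the example value ranges in the comments are taken from the raw responses below:

```python
from dataclasses import dataclass

# One coded comment, mirroring the "Coding Result" table above.
# The class name is illustrative; field values come from the table.
@dataclass
class CodingResult:
    responsibility: str  # e.g. ai_itself, developer, government, distributed, none
    reasoning: str       # e.g. consequentialist, deontological, mixed
    policy: str          # e.g. none, ban, regulate, liability
    emotion: str         # e.g. fear, outrage, resignation, approval, indifference
    coded_at: str        # ISO 8601 timestamp of when the coding ran

result = CodingResult(
    responsibility="ai_itself",
    reasoning="consequentialist",
    policy="none",
    emotion="fear",
    coded_at="2026-04-27T06:24:59.937377",
)
print(result.responsibility, result.emotion)
```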
Raw LLM Response
[
{"id":"ytr_Ugw0U6jEbY3rslkTf7d4AaABAg.AJ9MfysmpgRAJ9NsFuy2i0","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytr_UgziPN2p7wScdhonm4J4AaABAg.AJ7hIUmn-1lAJ9-tqlFT3r","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgziPN2p7wScdhonm4J4AaABAg.AJ7hIUmn-1lAJ9tS8aP0sZ","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytr_UgyYsaDNo4Y5lrCj16t4AaABAg.AJ77jXhrnH-AJ77xRSCeqC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_Ugx2CEcKrllfillr61N4AaABAg.AJ6XU5jswQuAJUrTTssSAD","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytr_UgwRk1PXNF_6LV8_KE94AaABAg.AJ5TXHR-57EAJImHLjzLsC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytr_UgwRk1PXNF_6LV8_KE94AaABAg.AJ5TXHR-57EAK0f5DOnJ4-","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytr_Ugwp3vy0lPkHrAnisLV4AaABAg.AJ5B2s_fdbwAJdRfYM3K3e","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytr_Ugwp3vy0lPkHrAnisLV4AaABAg.AJ5B2s_fdbwAJhfCgdGh1D","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytr_Ugxz7z5KugbHy6xCwaN4AaABAg.AJ4xp3hzD0eAJZQEYFhic9","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
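The raw LLM response is a JSON array of per-comment records keyed by comment ID, so a "look up by comment ID" can be sketched by indexing the parsed array. A minimal sketch, using two records copied from the response above; the helper name is illustrative:

```python
import json

# Two records copied verbatim from the raw LLM response above.
raw = '''[
{"id":"ytr_Ugw0U6jEbY3rslkTf7d4AaABAg.AJ9MfysmpgRAJ9NsFuy2i0","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytr_UgziPN2p7wScdhonm4J4AaABAg.AJ7hIUmn-1lAJ9-tqlFT3r","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]'''

def lookup(raw_json: str, comment_id: str) -> dict:
    """Index the coded records by comment ID and return the matching one."""
    by_id = {rec["id"]: rec for rec in json.loads(raw_json)}
    return by_id[comment_id]

rec = lookup(raw, "ytr_UgziPN2p7wScdhonm4J4AaABAg.AJ7hIUmn-1lAJ9-tqlFT3r")
print(rec["responsibility"], rec["emotion"])  # ai_itself fear
```

In practice the full array would come from the coding run's stored output rather than a string literal.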