Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `rdc_ohziekp`: "As a junior(with 2yoe, so not a complete newbie, but still a newbie), I can see …"
- `ytr_UgxzDglcJ…`: "Well it's the amount of progress the AI is achieving every year that is scary. I…"
- `ytc_Ugy-ZTmFy…`: "Oh, wow, so ChatGPT has a dark side now? I guess that explains why it’s been giv…"
- `ytc_UgzB6ktaQ…`: "Invitation: From Fragmentation to the Whole / The Root of the Conflict: We are cu…"
- `ytc_Ugyrv9rSy…`: "AI = Employeed 🤖 happy robots 🤖 / No Teachers = Unemployeed in millions / No Homew…"
- `ytc_UgzeVRpEv…`: "As someone who cant affort art supply i want but can use all types of art suply …"
- `ytc_UgyAzTc_3…`: "I think Universal basic Income will make people happy aka work with what they lo…"
- `ytc_Ugxz4cdA5…`: "1:02:25 there’s a level at which I think that AGI is gonna happen whether we wan…"
Comment
Regarding the creating of AI in general and the mounting fears about it taking over, Just because we can, doesn't mean we should. And yet we did.
Therefore, if we are deemed inferior by our creation and end up being hunted to the point of extinction (or near extinction so we can be put in zoos) we'll have no one to blame but ourselves.
I find it interesting that our ancestors created God to help explain the natural forces. Now that we no longer need to keep that myth alive and we're not only in the process of cutting it loose (something that is long overdue) but we have, as "gods" ourselves at some level, managed to create our own Grim Reaper.
To take it to the next level, how will AI destroy itself (long after the failed petrie dish experiment that is us, is gone)?
Source: youtube · AI Moral Status · 2025-06-08T18:3… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
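A coded record like the one above can be checked against the coding scheme before display. A minimal sketch, assuming a codebook inferred only from the values visible in this dump (the real scheme may define additional codes, and `CODEBOOK` is a hypothetical name, not the tool's actual code):

```python
# Hypothetical codebook: the allowed values per dimension are inferred
# from the codes visible in this page, not taken from the tool's source.
CODEBOOK = {
    "responsibility": {"none", "user", "company", "developer", "distributed"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"fear", "indifference", "approval", "outrage", "resignation"},
}

# The coded result shown in the table above.
coded = {
    "responsibility": "distributed",
    "reasoning": "consequentialist",
    "policy": "regulate",
    "emotion": "fear",
}

# Reject any value outside the codebook for its dimension.
for dimension, value in coded.items():
    assert value in CODEBOOK[dimension], f"invalid {dimension}: {value}"
print("record is valid")  # prints "record is valid"
```

This kind of check catches a model that drifts off-schema (e.g. inventing a new emotion label) before the record reaches the results table.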
Raw LLM Response
[
{"id":"ytc_Ugx2xMZS_lRbgwGt41p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxqkIBT3BNtCKy331l4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgwyhpZWj_iuQmvgMex4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzx0bRSuxlj4UfH-5F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx-Bm2Ok6M4eLOeHT54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzHNo2WAIolUbeo0oJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx9fAJ7y90CshbfSRt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyLgGmQG3A1rSuvw3t4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxmjGprV32Fw4Objg54AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzXWNs44mmrg-OrSnZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
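The raw model output above is a JSON array of coded comments, one object per comment ID. A minimal sketch of parsing it and looking a record up by ID, as the panel's "Look up by comment ID" feature does; the variable names and the `REQUIRED_FIELDS` check are assumptions for illustration, not the tool's actual implementation (the array is shortened to one record here):

```python
import json

# Every coded record is expected to carry these five fields
# (an assumption based on the response shown above).
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

# One record from the raw LLM response above.
raw = """[
 {"id":"ytc_UgzXWNs44mmrg-OrSnZ4AaABAg","responsibility":"distributed",
  "reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]"""

records = json.loads(raw)

# Index records by comment ID, rejecting any with missing fields.
by_id = {}
for rec in records:
    missing = REQUIRED_FIELDS - rec.keys()
    if missing:
        raise ValueError(f"record {rec.get('id')} missing fields: {missing}")
    by_id[rec["id"]] = rec

coded = by_id["ytc_UgzXWNs44mmrg-OrSnZ4AaABAg"]
print(coded["emotion"])  # prints "fear"
```

Indexing by ID is what lets the page map each coded row back to the original comment text shown above it.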