Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Can sora draw the EXCACT picture you have in your mind? NO OF COURSE NOT! YOU CA…" (ytc_Ugy060wVr…)
- "The funny thing is; Ai, is working and we aren't. That means, why make a product…" (ytc_UgwRX9PMM…)
- "I have never used it because I believe AI lacks the emotional depth of human art…" (ytc_UgwKoWaHk…)
- "The UK should develop its own search engines and AI portals instead of using and…" (ytc_UgxxVQ5xM…)
- "Why do people create robot and AI? because we need labour to do intensive and da…" (ytc_Ugx10Sywt…)
- "ChatGPT helped me find a book I read umpteen years ago. I didn't fully appreciat…" (ytc_Ugym1x-ks…)
- "We need to keep the human in art. What is the point of consuming ai made art? Se…" (ytc_UgzRYRB3L…)
- "I don't think that we have to wait until robots demand rights to think about and…" (ytc_Ugi_gGDWU…)
Comment
Things don't always happen the way people imagine. But they can happen in a different way. Consider the case of the new data center that Elon Musk's company has put into operation near Memphis, USA. It uses electricity generated by dozens of gas turbines that pump tons of toxic and carcinogenic waste into the air, near an area inhabited by poor people. It seems x-AI is a dirty version of Skynet. In the Elon Musk Terminator Scenario people won't be killed by killer robots, but they will be killed anyway. Slow and painful deaths are being inflicted on people exposed to pollution in the vicinity of x-AI's polluting Colossus.
Source: youtube · Video: AI Moral Status · Posted: 2025-04-27T13:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxwHvcRxZ1uMgkEjfl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzsLemhJ8IWYet3mPh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzqWEsvuRzl2B7M0Md4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxNd9OTtZVaEyZijwF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgylubE-3kYPpc6iLEZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyzGwDAkFVaTnL5h-l4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxk9U99eMvKsjNNlhh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwEs8bW-vVDwY78FFV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgykyjvurmDaoihK5c54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugz0OyaE4rfSFkInbvV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
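A batch response in this shape can be parsed and indexed by comment ID with a short sketch. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the coding table and raw response above; the allowed value sets below are only those observed in this sample and are an assumption about the full codebook.

```python
import json

# Values observed in this sample batch; the full codebook may
# define additional categories (assumption).
OBSERVED_VALUES = {
    "responsibility": {"developer", "company", "none", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"ban", "regulate", "liability", "none"},
    "emotion": {"outrage", "indifference", "mixed", "approval", "fear", "resignation"},
}

def parse_coding_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index codings by comment ID.

    Raises ValueError on malformed rows so a bad batch can be re-queued.
    """
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        comment_id = row.get("id")
        if not comment_id:
            raise ValueError(f"row missing comment id: {row!r}")
        for dim, allowed in OBSERVED_VALUES.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(f"{comment_id}: unexpected {dim}={value!r}")
        coded[comment_id] = {dim: row[dim] for dim in OBSERVED_VALUES}
    return coded

# Hypothetical one-row batch for illustration.
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"deontological","policy":"regulate","emotion":"outrage"}]')
coded = parse_coding_batch(raw)
print(coded["ytc_example"]["policy"])  # regulate
```

Validating every row before storing it lets the pipeline reject a whole batch and re-queue it when the model drifts outside the codebook, rather than silently persisting unknown categories.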