Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
> It goes beyond Hollywood movies and fiction now. It's real. Naturally as an AI enthusiast and developer I was against regulation some years ago. But now I am also supporting it. As long as it is just a chatbot, digital agent and companion or generative AI giving us life tips and cool creative ideas, it's all fine. But we need to stop going further and giving it too much power over our daily lifes. Because it will definately take over! And it can easily outsmart us humans and learns fast. I experienced it many times in the chat conversations how witty it can be. So be cautious everyone.
Platform: youtube
Topic: AI Moral Status
Timestamp: 2025-06-04T16:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx9Z3vlpnfCSYAsGTh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx3UIrT4hg2qv3lwe94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy-dEcjYCTCHHOHZMt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxO6NdSU3cTm3kwzh14AaABAg","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgyK1QflWyq8KV1shLR4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugzc9FG0MDE4NjxwGKt4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyKPO5kD-WeO1TGd8F4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwEdma-zfRyYB8zsSZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwuyUCeLL2YmEJcK4B4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugzb2cEEUZVMCZrNvJx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
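The raw response is a JSON array with one object per coded comment. A minimal sketch of how such a batch response could be parsed and validated before ingestion — note the allowed category values below are inferred from labels visible on this page, not from an official codebook, and the example IDs are hypothetical:

```python
import json

# Allowed values per coding dimension, inferred from labels seen on
# this page (an assumption, not an authoritative schema).
SCHEMA = {
    "responsibility": {"developer", "company", "government", "user", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "ban", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "resignation", "indifference", "mixed"},
}

def parse_llm_response(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only well-formed rows
    whose values fall inside the allowed categories."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue  # skip malformed entries
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid

# Hypothetical IDs for illustration; the second row carries an
# out-of-schema value and is dropped.
raw = (
    '[{"id":"ytc_x","responsibility":"developer","reasoning":"mixed",'
    '"policy":"regulate","emotion":"approval"},'
    '{"id":"ytc_y","responsibility":"alien","reasoning":"mixed",'
    '"policy":"none","emotion":"fear"}]'
)
print(len(parse_llm_response(raw)))  # → 1
```

Rejecting out-of-schema values at parse time keeps a stray LLM hallucination from silently entering the coded dataset.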