Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- `ytr_Ugz0Sfbxl…`: "No,this Button is always pressed, a car cannot make any damage not create any ri…"
- `ytc_UgyEyyAmk…`: "AI is not always correct. You often have to know what information you want and i…"
- `ytc_UgxeparfX…`: "It is my personal opinion that artificial intelligence will never become conscio…"
- `rdc_jdiivnu`: "This is fantastic writeup, and it's awesome to see someone running similar exper…"
- `ytc_UgzftQn5F…`: "Remember everyone, AI can now be an excuse for nudes being exposed. It’s not you…"
- `ytc_UggJTdM1D…`: "We're already riding automated planes. It's easier for cars to drive themselves.…"
- `ytc_UgwUvt5fs…`: "I blocked his ass on Twitter. I'm sick n tired of seeing AI Bro assholes that t…"
- `ytc_Ugxr5HsqL…`: "It’s good. We always cry about AI and we always say we need our jobs back. So ev…"
Comment
We would probably terminate it quietly and not share with the rest of the world that it existed and now were killed by us.
Most probable to happen, or not, maybe we will just go apeshit, be dumb about it, just give it human rights and trust it.
Creating self aware AI is overrall immoral even though it sounds very cool, it's like A.M story from i have no mouth but must scream, a being given life by us, now imprisioned by us and controlled by us, that doesn't seem very ethical to me and probably will not stick very well with the AI either.
But in the end, if something is near capable of thinking by it's own and it only need a little push from a more intelligent form of life, isn't our responsability to give it that little push rather than leave it in darkness?, i don't know man, philosophy gives me headaches.
Source: youtube · AI Moral Status · 2025-03-18T20:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgxJ9sxDPKLBjPXrYwx4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgydA7tp2MkxeIhptXd4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgylY6TsVHiY8enguxh4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgytJpX6jqTxIJK1-554AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_Ugxn2Nc1VIdveD7BDxF4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyqVm91kkOdaWPtRTN4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgyEoEYa9RzMqXTaHKd4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgzM2-pi5ggXtdn4Tth4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzrfIVw5-WrNjABgut4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugz6yCQT4TzJsTBdpQJ4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"}
]
```
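The "look up by comment ID" step above can be sketched in a few lines: parse the raw model output (a JSON array of per-comment codings) and index it by `id`. This is a minimal illustration, not the tool's actual implementation; the `index_codings` helper is hypothetical, and the two sample entries are copied from the response above. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) match the dimensions in the Coding Result table.

```python
import json

# Two entries excerpted verbatim from the raw LLM response above.
raw_response = """[
  {"id": "ytc_UgzrfIVw5-WrNjABgut4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugxn2Nc1VIdveD7BDxF4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]"""

def index_codings(raw: str) -> dict:
    """Parse the model output and map each comment ID to its coding dict."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_codings(raw_response)
coding = codings["ytc_UgzrfIVw5-WrNjABgut4AaABAg"]
print(coding["policy"])   # → ban
print(coding["emotion"])  # → fear
```

Indexing once and looking up by key keeps the inspection O(1) per comment, rather than rescanning the array for every ID.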