Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I really like those philosophical dilemmas. If you say "No they will never deserve rights, because we are special and smart" consider this: Assuming that at some point out of scientific curiosity or economic interests we really build self developing AI or AI capable of building better AI. What if at some point AI is as smart as a human? What if it becomes 10 times, 100 times, 10.000.000 times smarter as a human. Theoratically there is no limit to the capabilities of an AI. At that point the AI is to us what we are to amoeba. How the fuck would we deny such an entity anything? What makes us so special that we would have the right or even the capability of denying something to a concious beeing smarter and more evolved then billions of humans combined?
| Field | Value |
|---|---|
| Platform | youtube |
| Video | AI Moral Status |
| Posted | 2017-02-23T15:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgiFBSdhZ6OiBHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Uggmwyliw6Ndm3gCoAEC","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgjagJyEa3ihdHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UggaSjYh5W4t03gCoAEC","responsibility":"none","reasoning":"unclear","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgjYo2NEXZe5yngCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugj7xHSCB362wngCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UggFmovwouz0T3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UggXNbZRYXgRMXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgiOm4edH9tF53gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgiVIEHUHhcKzngCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
```
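The raw response is a JSON array with one object per coded comment, each carrying the four coding dimensions shown in the table above. A minimal sketch of how such a batch might be parsed and sanity-checked; the allowed value sets below are inferred from this sample alone, not a definitive codebook:

```python
import json

# Value sets observed in the sample output above -- an assumption,
# not the tool's full codebook.
VALID = {
    "responsibility": {"none", "developer", "ai_itself", "distributed"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"none", "unclear", "regulate", "ban", "industry_self"},
    "emotion": {"indifference", "mixed", "approval", "fear"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw batch response and reject records with unexpected values."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in VALID.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}"
                )
    return records
```

Records that pass can then be keyed by `id` for the comment-level lookup the page offers; a record with a value outside the expected set fails fast instead of silently entering the coded dataset.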