Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "Would the “right” sacrifice depend on how reversible the AI is? On NanoGPT, swap…" (ID: ytc_Ugwm05QuS…)
- "The harm done by this stuff is much less than the harm done by the hundreds of v…" (ID: ytc_Ugxmw1jk8…)
- "We don't need A.I. put it back where ever it was found. Or better yet destroy it…" (ID: ytc_UgykYwErG…)
- "This is content intended for the unintelligent. Here… Prompt: Consider this vi…" (ID: ytc_Ugzk343dz…)
- "It's just waiting for the laws to catch up 😂clearly you haven't been following t…" (ID: ytr_Ugxai_39N…)
- "I think yelling at a robot is worse, this at least make some kind of fucked up s…" (ID: rdc_eczinok)
- "The nature of biological life will not win the race to superintelligence. Ever. …" (ID: ytc_UgxWKaUKG…)
- "😮 when AI turns me into a cyborg😮 I will feast on their weak flesh and mines😊…" (ID: ytc_UgwQ1Cieo…)
Comment
I think there's waaaay too much deep thought put into this sort of question. Most human and animal rights are derived from the underlying principle that we had nothing to do with their conception just like they had no hand in ours. For this reason, we should not be allowed to decide their lot in life. Creating the most advanced AI imaginable however doesn't change what it is...artificial. Even if they became self aware, they were created by people and therefore, not their own entity. You don't honestly expect there to be a city of robots and electronics everywhere creating their own economy and laws...
Source: youtube · AI Moral Status · 2017-02-23T13:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugi9N6GWL6cC7ngCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UghiCDQ5-AqcYngCoAEC","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"approval"},
  {"id":"ytc_UghqesxgJCu2HngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugi49UirK0ZNlngCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghBFYgz1bIil3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgipLAfLyJQ7FXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgjcqmMQdYOWJXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgjTm-RYCS7jdngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UggkmMM8P9RXzHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgiL3f3OtGXlw3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
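The raw response is a JSON array in which each record carries an `id` plus the four coding dimensions shown in the table above (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a batch could be parsed and sanity-checked downstream, assuming only that schema (the two records and the tallying helper here are illustrative, not the project's actual pipeline):

```python
import json
from collections import Counter

# The four coded dimensions, taken from the coding-result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

# Illustrative two-record batch in the same shape as the raw response.
raw = """[
  {"id": "ytc_example1", "responsibility": "developer",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_example2", "responsibility": "none",
   "reasoning": "contractualist", "policy": "none", "emotion": "approval"}
]"""

records = json.loads(raw)

for rec in records:
    # Every record must carry an id plus all four coded dimensions;
    # a malformed LLM response fails fast here instead of polluting tallies.
    missing = [d for d in DIMENSIONS if d not in rec]
    assert "id" in rec and not missing, f"bad record: {rec}"

# Tally one dimension across the batch.
tally = Counter(rec["reasoning"] for rec in records)
print(dict(tally))  # -> {'deontological': 1, 'contractualist': 1}
```

Validating before tallying matters because the model output is free-form text: a dropped key or truncated array should surface as an error at parse time, not as a silently skewed count.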