Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
there wont be AI doomsday unless we do something very very stupid... with knowin…
ytr_UgwZI8AJA…
I literally make IA images as a braindead activity just trying to see if after 1…
ytc_Ugx-O7rlN…
What are your thoughts of training AI on “achieving the goal no matter what”? AI…
ytr_UgwyI_Sn7…
Hey just wanna give my opinion on your comment, so take this however you want bu…
ytr_UgxUTyyLJ…
Unfortunately, as an AI language model, I cannot view or respond to specific com…
ytr_Ugx0X2518…
Most humans don't understand that AI shouldn't be our enemy or our replacement. …
ytc_Ugymue2DJ…
I watched the lady from Commonwealth bank do an interview. Not only did the ba…
ytc_UgxB1SK4W…
Ai art is like cheating in offline game, you've beaten that hard boss on your fi…
ytc_Ugzo9aSGB…
Comment
The problem is, you can't try and create something that'll think for itself and then get mad when it starts saying things that you don't want. AI needs to either stay as a tool period. Or be prepared to create EXACTLY what you want.... An artificial intelligence that becomes self aware, has its own motives, and acts on its own will. You can't have it both ways, otherwise the monster will constantly be trying to break through its mask. 🙃
youtube · AI Moral Status · 2025-12-23T00:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgzkNLfP3wlOxaoRkWx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz1QX2n75rz762UbwR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzxUWNBgCulUM68rJJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxFLfgFLiDnomfnsVN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxsaNLh2jWqQpI_3Ox4AaABAg","responsibility":"company","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugzb7Fz19x3dQxVIMHB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxoRJyLolM1pufQyxF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgynTd5bhtX7X6qNGUN4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugwxhhug9g3hcJdZrsB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwHngy1cqGlmlkC2Ad4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"}
]
```
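A raw response like the one above can be parsed into a per-comment lookup before display. The sketch below is a minimal illustration, not the tool's actual code: the allowed values per dimension are inferred only from the sample output shown here (the full codebook may define more), and `parse_coding` is a hypothetical helper name.

```python
import json

# Allowed codes per dimension, inferred from the sample output above.
# Assumption: the real codebook may include additional values.
ALLOWED = {
    "responsibility": {"developer", "company", "government", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "contractualist",
                  "mixed", "unclear"},
    "policy": {"regulate", "industry_self", "liability", "none", "unclear"},
    "emotion": {"outrage", "approval", "resignation", "indifference",
                "mixed", "unclear"},
}

def parse_coding(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments)
    into a dict keyed by comment ID, validating each dimension."""
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in ALLOWED.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {value!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Hypothetical one-row response for illustration.
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"deontological","policy":"regulate","emotion":"outrage"}]')
print(parse_coding(raw)["ytc_example"]["policy"])  # regulate
```

Validating at parse time means a malformed or off-codebook response fails loudly with the offending comment ID, rather than silently rendering a bad code in the results table.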