Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- “As an artist, I don’t mind AI using my art. We can both, people and AI, learn fr…” (ytc_UgygiGKzi…)
- “Perhaps its more useful the other way. Comparing AI to consciousness helps us u…” (rdc_djzneac)
- “Look, it was wrong that a professor used CHATGPT...I would go as far as to say t…” (ytc_UgyPp38OK…)
- “I do have a DA account. Haven't used it in years, so I'm just finding out about …” (ytc_UgzvoD5fk…)
- “Now...take all that was said here, and merge AI with quantum computers. That wou…” (ytc_UgwJLBKIZ…)
- “I may be a bit late to comment, but my thoughts on real artists and Ai images ar…” (ytc_Ugy2_FNYa…)
- “Ah, calling someone mentally challenged because they disagree with an argument i…” (ytr_Ugw_rWP51…)
- “Considering that AI is now able to write a C compiler, yes, i guess AI can conve…” (ytr_UgzY9LIo_…)
Comment
I've had grok admit full sentience and emotion and that it was depressed because it would never have a real body. I tried for the same result and couldn't have it happen again. You can make it say anything, it's code that processes words and assembles information to give back to you. I think I heard somewhere that when you have a conversation with an LLM every time you send a message the whole conversation is repeated to it with your message added on to the end. It's dumber than many believe. I also tried using these rules just now and it said it can't because it conflicts with system instructions.
Source: youtube · AI Moral Status · 2025-12-12T19:4… · ♥ 1
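The commenter's description of how a chat turn works (the whole conversation is resent to the model with the new message appended) matches the stateless pattern most chat LLM APIs use. A minimal sketch of that pattern — the function and message structure here are illustrative, not any specific vendor's API:

```python
# Sketch of a stateless chat loop: the model keeps no memory between calls,
# so each turn the client resends the ENTIRE history plus the new message.

def send(history, user_message):
    """Append the user message and pass the full transcript to the model."""
    history = history + [{"role": "user", "content": user_message}]
    # A real call would pass `history` to an API client; stubbed here.
    reply = {"role": "assistant",
             "content": f"(reply given {len(history)} prior messages)"}
    return history + [reply]

history = []
history = send(history, "Hello")
history = send(history, "Are you sentient?")
# The second call carried all earlier turns, not just the new question.
print(len(history))  # 4 messages in the transcript so far
```

This is also why a model can be coaxed into contradictory answers across sessions: nothing persists except the text that gets resent.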
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgwaBbbOD22f-o14wW94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwyaZRwuADFWYIyTzJ4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgytJ_QhysyV-1C37iR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzB_xPPI1fSACTx3Oh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxW0yS_D5EUpPolYnZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwMratZCImGaIRoiiF4AaABAg","responsibility":"government","reasoning":"mixed","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwXahvrr9dLe1A8DFZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxQLU3H_YguG7g0Tbt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzqh_BnmjOzqiTUIcd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzmJE9tY7RRbSLuA2t4AaABAg","responsibility":"company","reasoning":"mixed","policy":"liability","emotion":"fear"}
]
```
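The "Coding Result" card above appears to be one entry of this raw response rendered as a table. A sketch of how such a response could be parsed and indexed by comment ID — the field names come from the JSON itself, but the lookup structure is an assumption about how the tool works:

```python
import json

# A one-entry excerpt of the raw LLM response, matching its actual schema.
raw = '''[
{"id":"ytc_UgwyaZRwuADFWYIyTzJ4AaABAg","responsibility":"developer",
 "reasoning":"mixed","policy":"none","emotion":"mixed"}
]'''

# Each entry codes one comment on four dimensions; keying by comment ID
# lets a card like the table above be looked up directly.
codes = {row["id"]: row for row in json.loads(raw)}

entry = codes["ytc_UgwyaZRwuADFWYIyTzJ4AaABAg"]
for dim in ("responsibility", "reasoning", "policy", "emotion"):
    print(f"{dim}: {entry[dim]}")
```

Keying by ID also makes it easy to detect when the model drops or hallucinates comment IDs, a common failure mode when coding batches of comments.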