Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- @CorvoTanuar While I get where you're coming from, I do believe this can be onl… — `ytr_Ugw6dpDgD…`
- I actually asked Google Gemini something along these lines, and this is what "it… — `ytc_UgwT4GLDV…`
- I consider myself a fan of technology, but I'm more so a fan of human kind. Don'… — `ytc_Ugj3tGKii…`
- @Nowheretogo9999 How is it wild? It really isn't art 💀the AI is soul less so I d… — `ytr_Ugwi9TaMb…`
- Hmmm personally i use Ai to calculate my macros & kcal's, to double check my own… — `ytc_UgwB9Suoe…`
- AI regulation is absolutely essential and Trump administration banning it is a d… — `ytc_UgwvoIJrC…`
- Mostly copilot here. At current pricing, for enterprise, it’s mostly $20 for c… — `rdc_ohx2tml`
- ChatGPT often sends me into rabbit holes using it in my home recording studio wi… — `ytc_Ugw1yuRp7…`
Comment

> +Rusty Rivet Lmao I don't think I have. It just follows logically, I guess. If AI became sentient, it would be far superior to humans. And if it decided that humans weren't worthy of existing, there'd be little we could do to stop it. Just look at human history. We were extremely bad at coexisting with our own kind. AI will have absolutely no qualms about that. I guess my point is that as far as history indicates, there can only be one intelligent species on earth. The rest speaks for itself.

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Moral Status |
| Posted | 2017-02-25T03:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytr_Ugh5JFZ79nf9MXgCoAEC.8PM94Huv6Pp8PMhXN6Qf_B","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_UghdMxvyt73s-XgCoAEC.8PM7yBMbiAq8PM84E3ANux","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgivNXalcHA7u3gCoAEC.8PM7vj9aeK08PM_wkEqDaQ","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgivNXalcHA7u3gCoAEC.8PM7vj9aeK08PMbWztYuUL","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgjF9I1mY-z9s3gCoAEC.8PM7nA3oydp8PM8JYHmcuS","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgjjbV9fQpd1ZXgCoAEC.8PM5wHcXJ0I8PMVLPIqoz8","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytr_UgiWuhoLq2MVgXgCoAEC.8PM3uo2dudN8PM6o-vqWBe","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
{"id":"ytr_UggD3CaovmmoiXgCoAEC.8PM3Y8Lpf818PMpH_Li4q_","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytr_UgioLUGTrqCJbngCoAEC.8PM0ZDS8fnq8POhFeT4dP0","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugg7JvT5Ke9_Y3gCoAEC.8PLwN_QkUAW8PONhuXe8V0","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
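The raw response is a JSON array of per-comment codes, each carrying the comment ID plus the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such output could be parsed and indexed for the "look up by comment ID" view; the IDs and loading code here are hypothetical placeholders, not part of the tool:

```python
import json

# Hypothetical raw model output, shaped like the response above:
# a JSON array where each object has "id" plus four coding dimensions.
raw_response = """[
  {"id": "ytr_example1", "responsibility": "none",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_example2", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]"""

# Parse the array, then index it by comment ID for O(1) lookup.
codes = json.loads(raw_response)
by_id = {row["id"]: row for row in codes}

row = by_id["ytr_example2"]
print(row["responsibility"], row["emotion"])  # ai_itself fear
```

Indexing by ID rather than scanning the list on every query is what makes the per-comment inspection view cheap, since one batch response covers many comments.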