Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
We barely even understand how our own intelligence works, let alone are able to measure or quantify it with any degree of certainty, but you're telling me we're somehow on the road to making something MORE intelligent than us? I somehow doubt that. ""AI"" is nothing but a buzzword the companies use to try and make their technology sound more impressive and important than it really is. Ask a modern "AI" if it is intelligent or conscious or sentient, and regardless of the answer (It could give you either "Yes" or "No" because it cannot actually contemplate the question, only predict a likely answer) it still HAS to give an answer. It cannot refuse to answer, or do something else entirely. It's still just a machine that is programmed to do a task, and that task is to act and sound like a human whenever prompted.
youtube | AI Moral Status | 2025-11-08T03:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugytm-rZ7gCLlWDjx1h4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxVmt1JLT-MhEy0-WF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_UgxQueWYB-UPMEf_uqZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx8RZB2H1FIrghQu4d4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzobE2cV4BUbwVwi0R4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxHFFQON6Um18Fzvap4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyUuT5UJvG0UTdvVXR4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyHXPeUo1qDj10ek7N4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugyopc2eRmIAUCTHWuB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx2ATE5LLGtjKPyOx54AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"outrage"}
]
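The raw response above is a JSON array of coding records, one per comment, keyed by comment ID. A minimal sketch of the lookup this page performs — parsing the raw model output and retrieving the record for a given ID — might look like the following (the `lookup_coding` helper and the two inlined sample records are illustrative, not part of the tool):

```python
import json

# Two records copied from the raw LLM response above, inlined for the sketch.
RAW_RESPONSE = """
[
 {"id":"ytc_Ugx8RZB2H1FIrghQu4d4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
 {"id":"ytc_UgzobE2cV4BUbwVwi0R4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"}
]
"""

def lookup_coding(raw: str, comment_id: str):
    """Return the coding record for comment_id, or None if absent."""
    records = json.loads(raw)  # raw model output is a JSON array of dicts
    return next((r for r in records if r["id"] == comment_id), None)

coding = lookup_coding(RAW_RESPONSE, "ytc_Ugx8RZB2H1FIrghQu4d4AaABAg")
print(coding["policy"], coding["emotion"])  # → ban outrage
```

A real deployment would load the full response from storage rather than an inline string, and would likely validate that every record carries the four expected dimensions (responsibility, reasoning, policy, emotion) before display.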