Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click any comment to inspect its full coding):

- "We have smart calculators, why do we need mathematic people? It's the same thing…" (`ytr_UgyrQAJSL…`)
- "Bruh, ai aren't racist they're just stinking stupid. Grow a brain quit trying t…" (`ytc_UgxeXPAq9…`)
- "The speaker doesn't seem to suggest that other AI issues aren't important. That'…" (`ytr_UgzLNVQwM…`)
- "Let's make this clear. Deep fakes are not a crime the moment you post a photo of…" (`ytc_Ugwq6kDKS…`)
- "Somehow, I don’t believe Steven was convinced at how dire things will be if we d…" (`ytc_UgyJv7o5d…`)
- "The biggest question for these big businesses is \"who is going to buy their good…" (`ytc_UgxtX2KWg…`)
- "I mean if you call yourself an „Artist“ by demanding an image is like my client …" (`ytc_Ugz86pp3-…`)
- "@harrisjm62 not fine. Art is human expression. Ai images are pixels vomited…" (`ytr_UgwHR82ZV…`)
Comment

> As discussed in the video, sometimes our language just lacks the vocabulary to talk about certain thinks, and maybe what I'm saying here is a lost cause, but I really think that it might be a good idea to avoid terminology associated with human behavior and psychology when discussing how AI works. Like, instead of saying that an LLM "cares about X", I'd say that X has a big influence on its output.

| Field | Value |
|---|---|
| Platform | youtube |
| Video | AI Moral Status |
| Posted | 2025-10-31T04:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxV8vgwmKcDgMum4w54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwG15S7YkMb3DLuvjF4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwI7HSH8iftaBPJmzB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugza6nUEuU0Jm_HnM0F4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugzyw6_2xAt_gL-E9Mt4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyTanRaGZXmnFBTU194AaABAg","responsibility":"user","reasoning":"virtue","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx3yjOnNHM-JUI6YIR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxZm2WJibEPTyCvE1x4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugznv2d0fWWUmHT9fs54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzuabJJ9Dxri4gCwjt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
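A raw batch response like the one above has to be parsed and checked before its codes are stored, since an LLM can return malformed JSON or out-of-vocabulary labels. The sketch below shows one way to do that in Python. The allowed category values are an assumption inferred only from the sample output on this page (the real codebook may define more), and `validate_batch` is a hypothetical helper, not part of the pipeline shown here.

```python
import json

# Allowed values per coding dimension. NOTE: inferred from the sample
# response above; the actual codebook may include additional categories.
ALLOWED = {
    "responsibility": {"developer", "company", "government", "user",
                       "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "approval", "indifference", "mixed", "unclear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed rows.

    A row is kept when it is a dict with an "id" field and every
    coding dimension holds a value from ALLOWED.
    """
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(row)
    return valid

# Example: one valid row passes, a row with an unknown label is dropped.
raw = ('[{"id":"ytc_example1","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"},'
       '{"id":"ytc_example2","responsibility":"alien",'
       '"reasoning":"mixed","policy":"none","emotion":"fear"}]')
print(len(validate_batch(raw)))  # 1
```

Rows that fail validation would typically be logged with their comment ID so the batch can be re-coded, rather than silently discarded.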