Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
We know it’s you people not ai, and in terms of this garbage idea, Made by human hands something perfect? What happens when it continues to make mistake? Cause it already is making huge mistakes? Why not just put people who can’t benefit in charge!? It’s almost like they’re trying to destroy everything! Scarcity isntb the problem, the problem is people making ai are equally flawed amd don’t even see their massive mistakes, in fact it’s based on our current ideas, so actually it could stagnate advancements! How does no one see this is horrible ideas
youtube · AI Moral Status · 2025-06-11T03:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw75zZcl-umD2FEs5x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugwx9ul-Qw5E_RefHFp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwUNbc3akc6xda6Nvh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwWCqxQXfRZxXHhQEJ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwucBGL2V0Oq0shWLp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzrdMcZ0K1UzsjvwiF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgypuULWW7t6lT6tXg54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwMJC4yRguPFi9nim54AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy-tggvNnuD8S9A0I54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugwb7W_OtoJsiTvMhHZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```