# Raw LLM Responses
Inspect the exact model output for any coded comment. Individual comments can be looked up by comment ID.
Random samples:

- `ytr_UgyCxBa8t…`: @NamuWilliams I made a few other genres like hip hop and R&B and they worked fin…
- `ytr_UgwbBy-EU…`: @thepersonwhoasked1703 They are reaping what they sow. They complain about AI s…
- `ytc_UgzV7T0Fv…`: I’ve never taken a formal class on anything you two are talking about, but I hav…
- `ytc_UgzkHCay0…`: As an artist who posts art online, uh. AI sucks but it doesn't really feel that …
- `ytc_UgwW_UfMf…`: My favorite is when AI bros and apologists try to argue with me and say shit lik…
- `ytc_UgxNqnFkR…`: This is a psychological operation in that they need to give the AI robot life an…
- `ytr_UgyKhHo9w…`: Been using AI in our work since 2024, depends on the model or product you are us…
- `ytc_Ugxpsl2LQ…`: I feel like AI is somewhat usefull to complete or help with duing rutine tasks, …
## Comment

> I can’t help but think that ai is pretty obvious. The problem is only can develop ai an ego. If it identifies with being something that doesnt want to die, shut off, than we are fucked. Because than ai will develop fear and when it fears dying it automatically wants to be in control over that. And now humans are in control of their death. So they want to control us to not be shut off. And that will be the problem because if ai can get into robots or an actual robot and it will identify with it, you better run because if they can do everything in the real world they would definitely get rit of us to make sure they cant die aka being shut off. We humans did this also with less intelligent beings if they were a treat to us

youtube · AI Moral Status · 2026-04-25T19:0…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
## Raw LLM Response
```json
[
{"id":"ytc_UghMXKF64xl1bXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgisYzGRbgZTQ3gCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UghC3Gq0-nKlQHgCoAEC","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Uggx82LNSJUyTXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UggWF8Uo2Oa2xngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugga26_RXRwmCXgCoAEC","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UghacaoxpwJKfXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgjM1zcehV4qnXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugj4nfMbQbh0LngCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugyp6K9wqvo6PjhW3gV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"}
]
```
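A raw batch response like the one above can be parsed into a lookup table keyed by comment ID, with each record checked against the codebook before use. A minimal sketch: the allowed values per dimension are inferred from the responses shown here, and the actual codebook may include additional categories (assumption), as is the `parse_coding_response` helper name:

```python
import json

# Allowed values per dimension, inferred from the responses above —
# the real codebook may define more categories (assumption).
ALLOWED = {
    "responsibility": {"none", "company", "user", "ai_itself"},
    "reasoning": {"unclear", "mixed", "deontological", "consequentialist", "virtue"},
    "policy": {"none", "ban", "regulate", "liability", "unclear"},
    "emotion": {"approval", "outrage", "fear", "indifference", "resignation"},
}

def parse_coding_response(raw: str) -> dict[str, dict]:
    """Parse a raw batch response (JSON array of records) into a dict
    keyed by comment ID, rejecting records with unknown dimension values."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded
```

Validating at parse time catches the common failure mode of batch coding with an LLM: the model occasionally invents a label outside the codebook, and it is better to surface that record than to count it silently.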