Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- The minute he said Eelon had no moral compass I stopped listening to this twat.… (ytc_UgwDLvinV…)
- Wow, such a novel idea, "hehe, what if the thing makes humans go extinct"... oh … (ytc_UgxgeR_zd…)
- Simply commention to push the video into the algorithm. You can do it too if you… (ytc_UgyElJ1je…)
- So what you're saying is I should try to put every big corp in the sauce by sayi… (ytc_UgwDK0OuV…)
- Good to see AI is finally reaching the commercial point it is replacing useless … (ytc_UgxvhC0TD…)
- Looking at AI they are not pre-natural, they are Natural. Meaning of the Natural… (ytc_UgzhCxgDj…)
- “Ai is going to decide it doesn’t need us” IT DOESNT HAVE CONSCIOUSNESS OR A SOU… (ytc_UgxH7rDZx…)
- We have lost our ways. These people can't think and we have AI taking over! Wh… (ytc_UgydNnhix…)
Comment
Maybe the answer is much less concerning and what we're seeing is the models referencing "lopsided", negative data because the internet is full of negative articles, spiteful memes, and lacks positive data because we humans don't react to it as much.
Since these are LLMs and NOT actual AI... This isn't a problem because LLMs are just smarter text prediction when every word is a token...
I really get the feeling people who are glued to this topic assume we're making something Lovecrafting without pulling back and remembering these models don't think, they don't remember, and they have no motive. We'll be fine.
youtube · AI Moral Status · 2025-12-11T13:4…
Coding Result
| Field | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:53.388235 |
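The coded values seen in the raw responses on this page suggest a small fixed codebook per dimension. A minimal validation sketch for one coded record — the allowed value sets below are inferred from this page's raw responses, not taken from the actual codebook:

```python
# Allowed values per coding dimension, inferred from the raw responses shown
# on this page (hypothetical -- the real codebook may include other values).
DIMENSIONS = {
    "responsibility": {"developer", "ai_itself", "user", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"ban", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed", "unclear"},
}

def validate(code: dict) -> list[str]:
    """Return a list of problems with one coded record (empty if valid)."""
    problems = []
    for dim, allowed in DIMENSIONS.items():
        if code.get(dim) not in allowed:
            problems.append(f"{dim}={code.get(dim)!r} not in codebook")
    return problems
```

A record whose every dimension falls in its allowed set validates cleanly; anything else (including a missing key) is reported per dimension.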
Raw LLM Response
[{"id":"ytc_UgwruxeNg8pgFYUX93d4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzih2g3jH7eHnWJuqx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgwXFCxV6hqBM8TeHa14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzrio2y2cMR19VH3nB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwagUsWfNqV2Jtp9rR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyHFqmCXPUyNul_zQV4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugxx84Gw0qNcSJsfBV14AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxtEkW2ihNLwZXuzjp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgybCOBnNlnKs53o6fF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx2U2Pte5fH0kyi_nZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"})
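Note that the raw response above closes the array with `)` rather than `]`, so it is not valid JSON as emitted — which may be why the coding result shows every dimension as unclear. A tolerant parsing sketch, using a shortened one-record stand-in for the raw string (the repair heuristic is an assumption, not part of the tool):

```python
import json

# Stand-in for a captured raw response: note it ends with ")" instead of "]",
# mirroring the malformed output shown above.
raw = ('[{"id":"ytc_UgwruxeNg8pgFYUX93d4AaABAg","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"})')

def parse_codes(text: str) -> list[dict]:
    """Parse the model's JSON array, repairing a stray trailing ')'."""
    cleaned = text.strip()
    if cleaned.endswith(")"):
        cleaned = cleaned[:-1] + "]"
    return json.loads(cleaned)

codes = parse_codes(raw)
by_id = {row["id"]: row for row in codes}  # index for comment-ID lookup
```

Indexing the parsed records by `id` supports the "look up by comment ID" view directly, e.g. `by_id["ytc_UgwruxeNg8pgFYUX93d4AaABAg"]["emotion"]`.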