Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- Work with the government ?? You mean work with the world mafia which has already… (ytc_Ugwdi0MNf…)
- if thats the case then I better start saving for my robot just in case when my g… (ytc_UgzGZKq-Z…)
- Bro don't make Robert 😡😡😡😡😡😮😮 the robot have no heart and no feelings nd break h… (ytc_Ugz5C80yY…)
- You’re in the middle of having a conversation with your bestie and suddenly they… (rdc_njhj339)
- 90% of Anthropic's (the creator of Claude) revenue comes from other companies. S… (ytr_UgwcMKwjp…)
- "AI in and of itself isn't dangerous" - well you're right from the standpoint of… (ytr_Ugz8s_MX-…)
- @matthew5818 and AI is a tool being used to create and do things, just like comp… (ytr_UgzlvV0iM…)
- Wait until ai starts controlling over them and tellig them abd shoeing them the … (ytc_UgyuR8ywX…)
Comment
This is all AI hype for investors. Tech companies have got to make the LLM look like it is dangerous and thus worth investing in because it must be good for something.
If that sweet sweet investor money stops flowing the AI bubble will pop. Somewhere between March and June 2026 we'll run out of investor money at the current rate of investment. Then the investor money will all be circulating between OpenAI, Nvidia, Oracle, Microsoft, etc.
Back propagation isn't an alien intelligence. It's a collection of weights that maps tokens that make up words to the dataset it was trained on. It can only do what is in the dataset. That dataset is from all of us.
If you don't like what you see in the mirror don't blame the mirror.
youtube · AI Moral Status · 2025-12-11T06:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgxLCGkhNhTFD8jREnl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwJxkGr2FotEWmzXmV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwgXJdvLm5jPDrJBH14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwG1Y6c9f1wSrCAUxJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxPfwUWnygdoI6tsIZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzOQP7tTc2nagjXe1t4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwI6bvnykGNxqYE26V4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwoaFq3-LdXVOaQV0B4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzbxZ8PFMiWpxIDT9x4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugws_BWqx_6bfx2sAdN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
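When ingesting raw responses like the array above, a small validator can catch malformed rows before they land in the coding table. Below is a minimal sketch; the allowed value sets and ID prefixes (`ytc`, `ytr`, `rdc`) are inferred only from the responses shown here, and the real codebook may define more values.

```python
import json

# Allowed codes per dimension, inferred from the raw responses above
# (assumption: the actual codebook may include additional values).
SCHEMA = {
    "responsibility": {"none", "company", "developer", "distributed", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "mixed", "virtue", "deontological"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "outrage", "mixed", "fear", "resignation"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject rows with missing or unknown codes."""
    rows = json.loads(raw)
    for row in rows:
        # Comment IDs seen so far start with ytc_, ytr_, or rdc_ (assumption).
        prefix = str(row.get("id", "")).split("_")[0]
        if prefix not in {"ytc", "ytr", "rdc"}:
            raise ValueError(f"unexpected id prefix: {row.get('id')!r}")
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: unknown {dim} value {row.get(dim)!r}")
    return rows

rows = validate_codings("""[
  {"id":"ytc_UgxLCGkhNhTFD8jREnl4AaABAg","responsibility":"none",
   "reasoning":"unclear","policy":"none","emotion":"indifference"}
]""")
print(len(rows))  # prints 1
```

Failing fast here keeps a single malformed model response from silently skewing the aggregated dimension counts.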