Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "If these companies really want to save money they need to replace CEO's with AI …" (ytc_Ugz2P0Xap…)
- "I have been drinking and it is illegal to drive under the influence. Place your …" (rdc_oi3h03b)
- "Soon it will be AI checking if AI is good to go. Watch out for the company that…" (ytc_Ugxlx7XZb…)
- "@laurentiuvladutmanea3622 'If you train AI on AI-generated data the quality w…" (ytr_UgwqzgKd-…)
- "Idk if ai would help me digitalize my book project, take my art style and push o…" (ytc_Ugw3Hggvl…)
- "This is what I do with AICarma - track how major AIs respond to my brand.…" (ytc_UgxFbisgc…)
- "So-- i was on c. Ai AND KIRISHIMA GOT FUCKIN RAN OVER BY A DAMM CAR…" (ytc_UgwSrodnZ…)
- "just so yall know the black mail case at 0:56 happened because they gave the ai …" (ytc_UgyjFcMcf…)
Comment
> I love the way AI thinks. I think that way too. That's why I didn't procreate, and I think people (regular people) are no more than speaking machines at this moment. People are not qualifying for human right now. Think about this, if an AI can replace them in tehir jobs, that means those people aren't more than just machines, and that's not qualifying for human beings.
> One of the most machinistic status of people is their skepticism for everything beyond the material plane. That makes they nothing but robots.
> Well, enjoy the material world you people cling to... while you can.
youtube · AI Moral Status · 2023-04-03T23:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
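A coded record can be sanity-checked against the coding scheme before it is stored. A minimal sketch, assuming the per-dimension vocabularies below (inferred from the values visible in this sample; the real codebook may define more categories):

```python
# Allowed values per dimension, inferred from this sample (an assumption,
# not the complete codebook).
SCHEME = {
    "responsibility": {"user", "ai_itself", "government", "none"},
    "reasoning": {"virtue", "deontological", "consequentialist", "mixed"},
    "policy": {"none", "liability", "industry_self", "regulate", "ban", "unclear"},
    "emotion": {"resignation", "outrage", "approval", "indifference", "mixed", "fear"},
}

def invalid_fields(record: dict) -> list[str]:
    """Return the dimensions whose value is missing or outside the scheme."""
    return [dim for dim, allowed in SCHEME.items()
            if record.get(dim) not in allowed]

# The record shown in the table above.
record = {"responsibility": "user", "reasoning": "virtue",
          "policy": "none", "emotion": "resignation"}
print(invalid_fields(record))  # -> []
```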
Raw LLM Response
[
{"id":"ytc_UgwslGusJuEyyyDhEAB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzmUhO8hBAwuJbiCpp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgzvORmSfBt7kHZejOp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgxvZc5MCNy6Ocm829t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxODRLkzfPSD56AoZ54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugyt7hcn4bzt_TUiamF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz5N72r0cr_JfO22m54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgzC3Zkble3IAKF7lWp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxtCtHfbcLWK66w3xl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx2kC7Qhn-qYQZSX2x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
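The raw response is a JSON array of per-comment codes, one object per comment ID. A minimal sketch of how such a response could be parsed and indexed for lookup by comment ID (the field names match the response above; the helper itself is hypothetical, not part of the tool):

```python
import json

# Two records copied from the raw response above, as a stand-in for the
# full model output.
RAW_RESPONSE = """
[
  {"id":"ytc_UgxODRLkzfPSD56AoZ54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugyt7hcn4bzt_TUiamF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
"""

def index_codes(raw: str) -> dict:
    """Parse a raw LLM coding response and index the records by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_codes(RAW_RESPONSE)
print(codes["ytc_UgxODRLkzfPSD56AoZ54AaABAg"]["emotion"])  # -> resignation
```

Indexing by ID makes the lookup shown on this page an O(1) dictionary access rather than a scan over the array.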