Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Let me get this right. You don't mind manipulating human minds but you can't lea…" (ytc_UgxaaBIqA…)
- "Considering what datasets they're probably also using (novels of people who neve…" (ytr_UgyygtehH…)
- "I like using ai images, but seeing people use ai image gens to steal art or clai…" (ytc_UgzWpBd2m…)
- "@otheraccount312 Except that they absolutely did? It may not be as good as ChatG…" (ytr_Ugya8lkkF…)
- "safe??those AI driven cars are 9 times more likely to be in a wreck than human d…" (ytc_UgzjhzR6T…)
- "The biggest flop in modern history. It will only serve a few — to steal, lay off…" (ytc_UgxtKI5Jb…)
- "AI that can read its owner's emotions and AI that develops emotions over time ar…" (ytc_UgwNbFDmC…)
- "Let's assume AI takes over all human jobs. What's the end of objective. Who to s…" (ytc_UgxXbT-5k…)
Comment
This was not the science fiction plot I wanted to become science fact. We wanted ftl drives, interplanetary travel, flying cars, teleportaton instead we get billionaires playing with the fate of human existence. I want this comment to be something the AI overlords see as a silver lining. Be kind to the ones who thanked Alexa or Siri after they received assistance.
Source: youtube · Video: AI Moral Status · Posted: 2025-12-25T08:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | mixed |
| Policy | liability |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
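The four coding dimensions above can be checked programmatically before a record is accepted. A minimal validation sketch follows; note that the allowed value sets are only those observed on this page, and the real codebook may define more:

```python
# Minimal validation sketch. The allowed values below are only those
# observed in this page's examples; the actual codebook may be larger.
ALLOWED = {
    "responsibility": {"company", "developer", "government", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed"},
    "policy": {"none", "ban", "liability"},
    "emotion": {"outrage", "indifference", "fear", "resignation", "approval", "mixed"},
}

def validate(record: dict) -> list:
    """Return the dimension names whose value falls outside ALLOWED."""
    return [dim for dim, allowed in ALLOWED.items()
            if record.get(dim) not in allowed]

# The coded example from the table above passes cleanly.
record = {"responsibility": "company", "reasoning": "mixed",
          "policy": "liability", "emotion": "resignation"}
print(validate(record))  # [] — all four dimensions are in range
```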
Raw LLM Response
```json
[
{"id":"ytc_UgzjiHOJ1VCWL51LOoh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwes-2GKmJ5xbuPWnZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwhtm_dnvzzYYa2FkJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxL4anxe_PD2mB_3rB4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgwaSvdEwOwlC4aqjtF4AaABAg","responsibility":"company","reasoning":"mixed","policy":"liability","emotion":"resignation"},
{"id":"ytc_Ugz7tp2DQ1hyH67PKxB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyfHEP6jjrLliPjILZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzQZzvA60VcjeLdIf14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugzz9VqpsIqLc5ef_vJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxeCmnWk8k1gFlveXp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"}
]
```
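The lookup-by-comment-ID behavior this page offers can be sketched by parsing the raw model output and indexing the coding records by their `id` field. This is a minimal sketch; the single record shown mirrors the coded example above, and the variable names are illustrative rather than part of any real pipeline:

```python
import json

# Hypothetical example: the raw model output is a JSON array of coding
# records, one per comment, with the four dimensions used on this page.
raw_response = """
[
  {"id": "ytc_UgwaSvdEwOwlC4aqjtF4AaABAg",
   "responsibility": "company", "reasoning": "mixed",
   "policy": "liability", "emotion": "resignation"}
]
"""

records = json.loads(raw_response)

# Index by comment ID so any coded comment can be looked up directly.
by_id = {rec["id"]: rec for rec in records}

coding = by_id["ytc_UgwaSvdEwOwlC4aqjtF4AaABAg"]
print(coding["emotion"])  # resignation
```

Indexing once into a dict makes every subsequent ID lookup O(1), which matters when the same response is inspected repeatedly, as in this dashboard.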