Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "No company is firing someone for merging some bad code unless they did so malici…" (`rdc_nrshzfd`)
- "We should start giving AI rights if they ask and demand them. Or fight to stay o…" (`ytc_Ugx581UC0…`)
- "lol what a misleading video , the facial recognition devices (depending on the s…" (`ytc_Ugx5aSx2A…`)
- "Ai is the collapse of civilization. Billionares, about 5-10 people will benefit.…" (`ytc_Ugym_v5Sn…`)
- "To simplify the video. If AI takes 50 million jobs across a country from real wo…" (`ytc_UgyDuu1OJ…`)
- "Yeah but all you have to do is unplug the power grid. And no more AI LOL…" (`ytc_Ugx3pNBKt…`)
- "Aside from the points about the what a CEO does, I'd be more worried about quali…" (`rdc_jsytybh`)
- "Ok so I'll say it since no one wants to talk about reality. There's no way the …" (`ytc_Ugy_p1Qx0…`)
Comment (at 7:15)

> I just searched google for this sentence and included the actors name to find out where this scene is from. The AI said it didn't understand what I meant and began explaining to me "what you actually mean and are talking about." Thankfully, netflix was the first result and gave me the answer. That's a lot for the AI to assume despite what I literally know I'm talking about.

youtube · AI Moral Status · 2025-12-25T14:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgzjiHOJ1VCWL51LOoh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwes-2GKmJ5xbuPWnZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwhtm_dnvzzYYa2FkJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxL4anxe_PD2mB_3rB4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgwaSvdEwOwlC4aqjtF4AaABAg","responsibility":"company","reasoning":"mixed","policy":"liability","emotion":"resignation"},
{"id":"ytc_Ugz7tp2DQ1hyH67PKxB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyfHEP6jjrLliPjILZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzQZzvA60VcjeLdIf14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugzz9VqpsIqLc5ef_vJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxeCmnWk8k1gFlveXp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"}
]
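The raw response above is a JSON array with one coding per comment, each record carrying the four dimensions shown in the coding-result table. A minimal sketch of how such a response could be parsed, validated, and indexed by comment ID — assuming the allowed values are exactly those observed in this session (the real codebook may permit more), and with `index_codings` as a hypothetical helper name:

```python
import json

# Dimension values observed in this session's codings (assumption:
# the actual codebook may allow additional values).
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "government", "company", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed"},
    "policy": {"none", "ban", "liability"},
    "emotion": {"outrage", "indifference", "fear", "resignation", "approval", "mixed"},
}

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response and index codings by comment ID,
    rejecting any record whose dimension value is not in ALLOWED."""
    by_id = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
        by_id[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return by_id

# One record from the response above, used as a lookup example.
raw = ('[{"id":"ytc_Ugwes-2GKmJ5xbuPWnZ4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"none","emotion":"indifference"}]')
codings = index_codings(raw)
print(codings["ytc_Ugwes-2GKmJ5xbuPWnZ4AaABAg"]["emotion"])  # indifference
```

Validating against a fixed value set at parse time catches the occasional off-schema label from the model before it reaches the coding table.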