Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
36:33 Can't believe I'm saying this, but I'm kind of on the clanker's side here. If you placed a rock near the lever in the trolley problem, you wouldn't say that, by not interacting with the lever, the rock is making the moral choice of harming the five people, right? The rock doesn't have the ability to choose, which must be a prerequisite for the choice to be a moral one.
Likewise, an LLM doesn't hold moral beliefs, it can only reason about beliefs in its training data. So while you might be able to procure a rock with the letters "I choose to pull the lever" drawn on it, that doesn't make it any more able to actually make a moral choice.
These videos are really interesting, but they kind of amount to technical ways to "draw the words on the rock" as it were.
youtube
2025-10-06T19:5…
♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyKP4uR00qGCyn6LuJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz0J6GCUL_lgAM0nml4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwgxcUuA4OHpkahfvF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugw_FU3JGW8agMF0gIl4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzQC1bPz-9zatqpC4x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugx1_jQ9gdA_nhiITGx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxdPpsmwAObwOBUEWp4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz5dmHtMnYsQ40DWGZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UgzYk88DBjpJrKyokTd4AaABAg","responsibility":"company","reasoning":"unclear","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgwSm23SLiASIaNkDEV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
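The raw response is a JSON array with one object per comment: an `id` plus one code for each of the four dimensions (responsibility, reasoning, policy, emotion). A minimal validation sketch in Python is below; note that the allowed-value sets are inferred only from the codes visible in this one response, so the real codebook may contain additional options.

```python
import json

# Dimension -> allowed codes. Inferred from the values visible in the
# response above; the full codebook may define more options (assumption).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company"},
    "reasoning": {"deontological", "mixed", "unclear"},
    "policy": {"none", "ban", "industry_self", "liability"},
    "emotion": {"mixed", "indifference", "outrage", "approval"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # "ytc_" appears to mark top-level comments and "ytr_" replies.
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            continue
        # Every dimension must be present and hold a known code.
        if all(rec.get(dim) in codes for dim, codes in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_UgyKP4uR00qGCyn6LuJ4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"mixed","policy":"none","emotion":"mixed"}]')
print(len(validate_batch(raw)))  # 1
```

Rejecting malformed records rather than raising keeps a single bad object from discarding the rest of a batch, which matters when one LLM call codes ten comments at a time as above.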