Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytc_UgxU4MlzN…: In response to a robot with emotions. It almost seems like it would have a versi…
- ytr_UgzKl9okY…: @garjura4659llm's are just expensive magic 8 balls. Llm's are complicated predi…
- ytc_UgxN2qZKO…: And to think that people with no insurance is signing up to use ChatGPT for ther…
- ytc_UgzisClPa…: You think that its a coincidence that the moment AI became a thing massive colmp…
- rdc_mbcj8p5: When you posted this here you know it will end up in training data for next AI m…
- ytc_UgwIBo8nm…: I don’t see how this is any different than the people who print generated artwor…
- ytc_Ugxu8M98e…: Great points and a topic of bipartisan disinterest. I'm very pessimistic on how …
- ytr_UgxkTFKmh…: That's a panpsychist or idealist position and it's actually one of the oldest an…
Comment
We need to NOT put all of our faith into AI mainly because humanity is currently not mature enough nor intelligent enough to deal with the dynamic learning of technology that uses much of our natural resources and actually is taking away from human beings actually learning through experience. I’m saying this and I love Star Trek TOS and futuristic stuff, but even Star Trek examined our humanity.
youtube
AI Moral Status
2025-04-04T00:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugyyn7Lw3xgP1JTkEVh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwLu-IDGb4kKG4ExSN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy1VE59Hfc21vpqZ5x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxJI8COm9kpXFuSwSZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzSdFmX1rppmuZ_vlh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgxEmijg1j2kfE1HPdJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxvTM0IQHuJ_wj3DRN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgysKtHak6K_7jkWyCd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwSk_LObwaPVXc-Vrp4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz01MXElYu06CgFUAh4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"}
]
```
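The raw response is a JSON array of per-comment codes, one object per comment ID, with one value for each coding dimension. A minimal sketch of how such a payload might be parsed and validated before it reaches the dashboard (the allowed values below are inferred from the dimension table and the sample codes above; the actual codebook may contain more categories):

```python
import json

# Allowed values per coding dimension, inferred from the coding-result
# table and the raw responses shown above (assumption, not the full codebook).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"ban", "regulate", "none", "unclear"},
    "emotion": {"outrage", "fear", "mixed", "indifference", "approval"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, rejecting any
    record with a missing ID or an out-of-vocabulary dimension value."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record missing id: {rec}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim} value {rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Hypothetical single-record payload in the same shape as the output above.
raw = ('[{"id":"ytc_x","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}]')
codes = parse_codes(raw)
print(codes["ytc_x"]["policy"])  # regulate
```

Validating at ingestion time keeps a single malformed or off-vocabulary record from silently corrupting the coded dataset: the batch fails loudly and can be re-run for just the offending comment IDs.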