Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Comment
Because it's extremely useful for accomplishing tasks. There is no scientific probe or robot that is as versatile as a human. Problem is that it's much more expensive and difficult to send a human to hostile environments.
Desire, imagination, pain. All of these things are potentially beneficial to some automated tasks. As Tom Scott pointed out, you'll never have a good automated moderator until you create AI that can think like a human.
youtube
AI Moral Status
2017-02-23T18:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytr_UghhCXHFomzRsHgCoAEC.8PKnKBBXMBq8PKp9eIArcp","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UghhCXHFomzRsHgCoAEC.8PKnKBBXMBq8PKpU6jVaEg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytr_UgggWgOiFrTcO3gCoAEC.8PKn6egddMM8PLmoig_92X","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UggKOdHS94nIv3gCoAEC.8PKkOXfQikf8PKw-2hInhZ","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UggJWv_0cbGWV3gCoAEC.8PKkIFJNggX8PKrtRwSYbL","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytr_UggBxlj9sgMv4ngCoAEC.8PKj7jtka-r8PKlp88yH1k","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UghA63n1SdXZhHgCoAEC.8PKj1N6N1pu8PKtDezBO_-","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UghA63n1SdXZhHgCoAEC.8PKj1N6N1pu8PKv7x5iY7a","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UghsxZ6od22E-3gCoAEC.8PKiayx9KfC8PKiyzsH0U0","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytr_Ugg-gSNd_eKpZngCoAEC.8PKiF8QgKMv8PKmBL9bNGR","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"unclear"}
]
```
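The raw response above is a JSON array with one object per coded comment: the comment `id` plus the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a response could be parsed and sanity-checked before storing the codes — note that the allowed value sets below are inferred from the values visible on this page, not from a documented codebook:

```python
import json

# Value sets observed in this page's output. ASSUMPTION: the real codebook
# may define additional values; extend these sets accordingly.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "user"},
    "reasoning": {"unclear", "deontological", "consequentialist"},
    "policy": {"unclear", "regulate", "liability"},
    "emotion": {"unclear", "indifference", "approval", "fear"},
}

def parse_raw_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose dimension
    values all fall inside the observed value sets."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items())
    ]

# Hypothetical one-row response for illustration.
raw = ('[{"id":"ytr_example","responsibility":"none","reasoning":"unclear",'
       '"policy":"unclear","emotion":"indifference"}]')
print(len(parse_raw_response(raw)))  # 1
```

Rows with out-of-vocabulary values are dropped rather than repaired, so a hallucinated label surfaces as a missing coding instead of silently entering the dataset.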