Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "What pipe dream is that, expect a company to give money to the person it's autom…" (ytr_UgyaXQkS6…)
- "AI models are statistical mirrors of the data they’re trained on. And human da…" (ytc_UgyOVUtBW…)
- "Aha! You almost got me, but then I realised that there's no such thing as an "AI…" (ytc_Ugzshk-mk…)
- "@34:00 The one sector that really does NOT want nuclear is the fossil fuel indus…" (ytc_UgwmqT_iH…)
- "Honestly, I see that AI "art" will probably be like NFTs in a sense where it may…" (ytc_Ugx6fyT4c…)
- "The comparison with nuclear weapons is flawed because they only work as a deterr…" (ytc_Ugx3xuZJW…)
- "Our society is Reactive not Proactive. This is why we won't get ahead of things …" (ytc_UgwDAESp9…)
- "I don't see the problem with being an AI artist, at least this guy has understoo…" (ytc_UgxpFdaYi…)
Comment

> It basically comes down to who is programming the robots. Good people or bad people. And if they're all connected together, what's to stop the good robot from learning from bad robots?

youtube · AI Moral Status · 2023-02-25T16:5… · ♥ 27
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzFWqB4f5gRaYAPm3p4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugx5DrV3IGcOq6uE0wh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxflvUW9y-vTCjv0xx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwgY9WsEXpK8k2bceF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyAv7YKaDo-Xv8bqWh4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzlMTNITbgksHdl_Xl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzQIeN_wQ1dr2YdHzp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzsLvLoV3SLtidN8_x4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxnJSbYJSstKLGn9Kd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgxhvkLhvqCdM1JM0v14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
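A raw response like the one above has to be parsed and validated before the codings can be attached to comments. A minimal sketch of that step, assuming the value vocabularies are inferred from this sample (the actual codebook may define additional categories, and the `parse_codings` helper is hypothetical):

```python
import json

# Allowed values per dimension, inferred from the sample response above;
# the real codebook may include categories not seen in this batch.
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed", "unclear"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: coding}, dropping any
    record that is missing an ID or uses an out-of-vocabulary value."""
    codings = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            codings[cid] = {dim: rec[dim] for dim in ALLOWED}
    return codings

raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(parse_codings(raw)["ytc_example"]["policy"])  # regulate
```

Validating against a fixed vocabulary catches the most common failure mode here: the model drifting into synonyms (e.g. "regulation" instead of "regulate") that would silently fragment the coding categories.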