Raw LLM Responses
Inspect the exact model output for any coded comment: look up a comment by its ID, or open one of the random samples below.
- "Ethics are not ONLY important with AI but also important with how employees and …" (`rdc_gm3vd7i`)
- "As a person who used AI art...well i literaly used it as a filler for art i I co…" (`ytc_UgyAfCRz7…`)
- "A super intelligence would need to have a completely automated supply chain beca…" (`ytc_Ugx0BtiYI…`)
- "I can’t really explain how I asked how to make a homemade atomic bomb and the AI…" (`ytc_UgynnDgbn…`)
- "Are you guys really this dumb? It’s so easy to just ask an AI model to roleplay …" (`ytc_UgwBOFfD1…`)
- "Having just fallen into a Steven Bartlett DOAC filled hole of interviews with do…" (`ytc_UgyDmLiKi…`)
- "@hyphydan I don’t disagree, but what unites people more than anything else is a …" (`ytr_UgxzNGNxB…`)
- "@drachefly We're already in deep trouble. China and other bad guys are using it…" (`ytr_Ugw-vK8R4…`)
Comment

> The vision he saw(scientist) (20-years-from-now). He can't assure that they will be friends with us. Someday-somewhere, one will change definitely and turn all among us..and then 'I Robot' will happen for sure. Don't make them and if you do, then atleast don't make them physically more strong than human. For example: Make them with no hands at all.

youtube · AI Moral Status · 2018-01-14T14:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
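Each coded record can be sanity-checked against the category values that appear in this section. A minimal sketch in Python; note that `ALLOWED` and `invalid_dimensions` are illustrative names, and the value sets below cover only what is visible on this page, not necessarily the tool's full codebook:

```python
# Category values observed in this section; the real codebook may define more.
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"regulate", "ban", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed", "resignation"},
}

def invalid_dimensions(record: dict) -> list[str]:
    """Return the coding dimensions whose value falls outside the observed sets."""
    return [dim for dim, values in ALLOWED.items()
            if record.get(dim) not in values]

# The coding result shown in the table above passes the check.
record = {"responsibility": "developer", "reasoning": "consequentialist",
          "policy": "regulate", "emotion": "fear"}
print(invalid_dimensions(record))  # []
```

A check like this is useful because LLM output is free text at heart: a single off-vocabulary value silently corrupts downstream tallies.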
Raw LLM Response
```json
[
  {"id": "ytc_UgyKc2VG12WJKMBHRFZ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwD4tIZv8o18ezmRV14AaABAg", "responsibility": "none", "reasoning": "contractualist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzzNBc2crh9fGhNvWN4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgyDCSRmlAlGz7A7W1p4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzSSEV02wjwwIO_RGN4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugw_4kBgq2BY87aGGx14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzvCX4-v23Bij5jynl4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgxqiDGZleUYBetiwYJ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyC3D6oLzlalKr5RXR4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzNQ9VYKtNTLRqaAzx4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "resignation"}
]
```
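The look-up-by-comment-ID behaviour described at the top of this page amounts to parsing a raw batch response like the one above and indexing its records by their `id` field. A minimal sketch in Python; the `RAW_RESPONSE` string (shortened to two records) and the function name are illustrative, not part of the tool:

```python
import json

# A raw LLM response is a JSON array of coding records, one per comment.
# Two records copied from the response above, for illustration.
RAW_RESPONSE = """
[
  {"id": "ytc_UgzSSEV02wjwwIO_RGN4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzzNBc2crh9fGhNvWN4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]
"""

def index_by_id(raw: str) -> dict[str, dict]:
    """Parse a raw batch response and key each coding record by its comment ID."""
    return {rec["id"]: rec for rec in json.loads(raw)}

codes = index_by_id(RAW_RESPONSE)
print(codes["ytc_UgzSSEV02wjwwIO_RGN4AaABAg"]["policy"])  # regulate
```

Keeping the raw response alongside the parsed codes, as this page does, makes it possible to audit exactly what the model emitted for any given comment.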