Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
You know you failed humanity when you have to build a Robot in order to tell you…
ytc_UgzL2PUW4…
The problem is if good guys pauses, bad guys will have a better AI. Human insani…
ytc_Ugxg-kyzD…
One time I was rossling an AI and it responded so realistically, I said that I d…
ytc_UgzFmg0LO…
I think what she meant by “fear of being turned off to help me focus on helping …
ytc_UgzsVEwpR…
I can't wait until AI can make NPC's respond intelligently in games! NO more ki…
ytc_UgzvVfe4g…
You have to ask, Are there ai's that you like and you dont like? Trust me.…
ytc_UgxhR67Yl…
Interesting to see what patent law will do when companies start filing patents f…
ytc_UgwOYrR1C…
If your serious, this is the point Americana become separate but equal. As we a…
ytc_UgxTc7-vx…
Comment
It's all well and good, if only the humanitarians use it but what if the worst world leaders or criminals use AI. Negative uses are what I'm concerned about. Deadly viruses, robot army's, controlling deadly drones, attacks on our infrastructure, etc. It could quickly introduce a nightmare scenario, which no one has anticipated.
youtube
AI Moral Status
2023-06-25T19:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx6FcUoiXofS9OPauF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugye4ptKrj8WAx6XlUx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzBr8Y1BfmqrbD1fqd4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgweuSmqJ5JTnhVdYKN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzXdb32Wz3vD54Wna94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyZK1eH0RpjPAxUvYl4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugwopk9-9Zzr1w8_1oJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgyRwSZ6GNrHvEkaYst4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyMAeF5rYqndCF1PTB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugzx9-F0_0MDLfnPlUZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
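A batch response like the one above can be parsed and indexed by comment ID for lookup. The following is a minimal sketch, not the tool's actual implementation: `index_batch` is a hypothetical helper, and the `ALLOWED` value sets are inferred only from the values visible in this sample (the full codebook may contain more categories).

```python
import json

# Allowed values per coding dimension -- inferred from the sample batch
# shown above; this is an assumption, not the full codebook.
ALLOWED = {
    "responsibility": {"distributed", "developer", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "none"},
    "emotion": {"fear", "outrage", "resignation", "mixed", "indifference"},
}


def index_batch(raw: str) -> dict:
    """Parse one raw LLM batch response and index codings by comment ID.

    Rejects rows whose values fall outside the known codebook, so a
    malformed model response fails loudly instead of polluting results.
    """
    coded = {}
    for row in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
        coded[row["id"]] = {dim: row[dim] for dim in ALLOWED}
    return coded


# Look up the coding for the sample comment shown on this page.
raw = ('[{"id":"ytc_Ugx6FcUoiXofS9OPauF4AaABAg",'
       '"responsibility":"distributed","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"fear"}]')
coded = index_batch(raw)
print(coded["ytc_Ugx6FcUoiXofS9OPauF4AaABAg"]["emotion"])  # fear
```

Keying the dictionary by comment ID mirrors the "look up by comment ID" feature described at the top of this page: one `json.loads` per batch, then O(1) retrieval per comment.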