Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Elon Musk stated that AI will be a greater threat to humanity than nuclear weapons. This is the subject of many science fiction films. But why on earth hasn't Isaac Asimov's robotic laws been the rule for all systems with AI? It should be implemented in every chip and artificial brain that a robot CAN NOT EVER do harm to a human being or to mankind.
youtube · AI Moral Status · 2022-12-18T18:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwdI4C2mjV5pr6jwF94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwhEUm4KbOpf6PyPDN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzebOlxIHvnyHlhIpB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxGVXvoxxEQ4cqfI414AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw08DAL-aV5k96-lmB4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgykruotFbBxUdmDETx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzWdU26SECWq8JAo-F4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugw2RHi2ufxNWOw0KY54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugx0xoDwG-eEblipYyB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwbjrCHNXUHkVvTish4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
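A batch response like the one above can be parsed and indexed by comment ID before the per-comment codings are stored. The sketch below is illustrative, not part of the original pipeline: it assumes the raw response is a JSON array of objects with exactly the four coding dimensions shown, and the allowed values are inferred only from the entries visible above (the real codebook may include more categories).

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# Hypothetical: the actual codebook may define additional categories.
SCHEMA = {
    "responsibility": {"none", "developer", "distributed", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "ban", "liability"},
    "emotion": {"fear", "mixed", "indifference", "outrage"},
}


def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM batch response and index codings by comment ID.

    Raises ValueError if an entry is missing a dimension or uses a value
    outside the inferred schema, so malformed model output fails loudly.
    """
    codings = {}
    for entry in json.loads(raw):
        for dim, allowed in SCHEMA.items():
            if entry.get(dim) not in allowed:
                raise ValueError(
                    f"{entry.get('id')}: unexpected {dim}={entry.get(dim)!r}"
                )
        codings[entry["id"]] = {dim: entry[dim] for dim in SCHEMA}
    return codings


# Example lookup using one entry from the response above.
raw = (
    '[{"id":"ytc_UgxGVXvoxxEQ4cqfI414AaABAg",'
    '"responsibility":"developer","reasoning":"deontological",'
    '"policy":"regulate","emotion":"fear"}]'
)
codings = parse_raw_response(raw)
print(codings["ytc_UgxGVXvoxxEQ4cqfI414AaABAg"]["policy"])  # → regulate
```

Validating against a fixed value set at parse time is what makes the coded dimensions safe to aggregate later; any hallucinated category surfaces immediately instead of silently becoming a new bucket.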