Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I say we befriend the AIs so we can both learn as a collective power in the universe. As for forcing robots to do hard labor? A simple question with an even simpler answer. Do not program a smart AI into the labor heavy robots. Sentient AIs should be used for purposes such as companionship or scientific or strategic innovation. NOT labor. Why program self awareness into an AI designed to mine for diamonds? it's entirely impractical and could result in a Skynet like scenario if they realize the horrible subjugation they have, yet also required to function as a technologically advanced civilization. I imagine that sentient AIs would find this method practical as well because robots are stronger and more efficient then humans to do such jobs. So long as your not stupid about it and make the first self aware AI for military use only then we have little to fear. At least that's my take.
youtube
AI Moral Status
2017-02-24T01:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[{"id":"ytc_Ughnexcsmb3x6XgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
 {"id":"ytc_UggvblKpw1_kgXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UgjVEzS6w8goNXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_UggfO1G2FHfPI3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgjcsyISv-nG-HgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
 {"id":"ytc_Ugio6zncMOKloXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgiG4UILVf0E13gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
 {"id":"ytc_UggRzJZbiuIY0XgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
 {"id":"ytc_UgiAg_hJ4iN9Z3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
 {"id":"ytc_UgjOBOe5JgVz5XgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"}]
```
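A raw response like the one above can be parsed and indexed by comment ID for lookup. A minimal sketch, assuming the model returned valid JSON (the field names and the two example IDs are copied from the response above; a real batch would load the full array):

```python
import json

# Two rows copied from the "Raw LLM Response" panel above, used as sample input.
raw = """[
  {"id":"ytc_Ughnexcsmb3x6XgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgjcsyISv-nG-HgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]"""

codes = json.loads(raw)                      # list of coded-comment dicts
by_id = {row["id"]: row for row in codes}    # index by comment ID

# Look up one coded comment by its ID.
row = by_id["ytc_UgjcsyISv-nG-HgCoAEC"]
print(row["policy"], row["emotion"])  # ban fear
```

In practice the model's output should be validated before indexing (e.g. checking that every row carries all four dimensions), since malformed JSON or missing fields are common failure modes for LLM-generated batches.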