Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_UgygIiia2…: you would think if someone tels an ai that they are going to do that to themsel…
- ytc_Ugygimfw8…: Yes AI requires massive amount of power. But if the US won’t do it, other countr…
- rdc_jcbphqg: Putin coming up with mindgames and geopolitical strategies only to get blasted b…
- ytc_Ugy3dw-gL…: The only thing that may help humans and is the last hope - insufficient electric…
- ytc_Ugxu-GjOu…: Also how hard is it to type what your concerned with in a chatbot, what training…
- ytc_UgzpHU5Gx…: BS! It is nothing but a preprogrammed robot. You wouldn't program your mobile "S…
- ytc_UgwLZZCYv…: Well said , AI art is under a great deal of attack right now and justly so .…
- ytc_UgywgUveD…: I'm not against ai being used to assist in the creative process, it's when you a…
Comment
It's kind of dangerous to create a sentient robot...I mean if the robots are fully aware of the fact that humanity is a sentient being who likes to destroy it self and capable of shutting them down...there's a possibility that they will rebel against us before we notice it..Like what Skynet do...oh..and...most of our nukes are controlled by computer commands...which is more...dangerous
youtube · AI Moral Status · 2017-02-23T13:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
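A coded record can be checked against the category values that appear in this section. This is a minimal validation sketch: the allowed values below are only those observed in the samples and raw responses shown here, so the real codebook may contain additional categories.

```python
# Category values observed in this section's samples; the full codebook
# may include values not shown here (assumption).
OBSERVED_VALUES = {
    "responsibility": {"none", "ai_itself", "developer", "distributed"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"mixed", "fear", "indifference", "resignation"},
}

def validate_coding(record: dict) -> list[str]:
    """Return a list of problems with one coded record (empty if clean)."""
    problems = []
    for dim, allowed in OBSERVED_VALUES.items():
        value = record.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"unexpected {dim} value: {value!r}")
    return problems

# The coding shown in the table above:
coded = {"responsibility": "distributed", "reasoning": "consequentialist",
         "policy": "regulate", "emotion": "fear"}
print(validate_coding(coded))  # []
```

A record with an out-of-vocabulary value (e.g. an emotion the codebook does not define) would come back with one problem string per bad dimension.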
Raw LLM Response
[
{"id":"ytc_UgglJFam1tfFy3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Uggr0MzI-Cq0tXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugiz5IFSqKmWk3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Ughu62T2mD5PXXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UggMI3PplbCU5HgCoAEC","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgjY1qXS_XuMu3gCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugg2GDXI5mdPoXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgiuOKAwXs9IYHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UghLYbFb4yOgUngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UghY1VZg8QIt73gCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
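The "look up by comment ID" view above amounts to parsing the raw response and indexing it by the `id` field. A hedged sketch, using a trimmed two-record copy of the JSON shown above:

```python
import json

# Two records copied verbatim from the raw LLM response above.
raw_response = """[
  {"id":"ytc_UgglJFam1tfFy3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UghY1VZg8QIt73gCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]"""

def index_by_id(response_text: str) -> dict[str, dict]:
    """Parse a raw LLM coding response and key each record by comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codings = index_by_id(raw_response)
print(codings["ytc_UghY1VZg8QIt73gCoAEC"]["emotion"])  # fear
```

Indexing once and looking up by ID keeps inspection O(1) per comment, which matters when a batch response covers many coded comments.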