Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a coded comment by its ID, or inspect one of the random samples below:
- “Well it’s really hard people said that it’s robot or a human so nobody knows…” (ytc_Ugz0eYHP3…)
- “It's not just an algorithm it's also just data slapped onto other data with bui…” (ytr_Ugxd-o1es…)
- “We need another Luigi to "talk to" the CEOs of these companies that are trying t…” (ytc_UgyjrdBR8…)
- “As an independent game developer, I can say that these LLMs have made me 3x more…” (ytc_UgxON7IAD…)
- “As always, i never want a job, i just want to do Ralph Breaks The Internet in re…” (ytc_UgywTVCm-…)
- “Bad actors are already misusing AI, for example, by reusing respected journalist…” (ytc_UgxkZNWCM…)
- “Currently watching the video so I'm not sure if this topic was mentioned but som…” (ytc_Ugz6hdfFQ…)
- “in a society so profoundly sick, one could argue creating AI now will adopt its …” (ytc_Ugzd3mDPl…)
Comment
This is such a deep subject, we could set a shift in history just by letting robbots think for themselves [this is not far from today] and just by being our own creation imagine the things they would want to accomplish? By that point we will just become something they wouldnt really need. It might sound going a little too far, but AI is really something we should really think of at this point we are just accepting everything that has to do with technology but its time for us to really see what we are doing, we are not just creating robots, we are creating life that would eventually be superior to us humans.
Platform: youtube · Video: AI Moral Status · Posted: 2017-02-25T07:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[{"id":"ytc_UgjR2zO_1LwfgXgCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UggOs3HwjLeo6HgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UggzjEvQA-SVuHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_Ughj52dn57v5_XgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UghQ9UQVYlM32ngCoAEC","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgjIXkiz05yonXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UghVxTy-agwO-HgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UghVIe6nF4TwM3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugh_UzizPwht13gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugjn9CpVjJQB5XgCoAEC","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"}]
```
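A raw response like the one above can be indexed by comment ID for lookup. The sketch below is a minimal, hypothetical example (the `parse_codes` helper is not part of this tool); it assumes the model may wrap the JSON array in extra text, so it locates the outermost brackets before parsing. The two rows shown are taken from the response above.

```python
import json

# Raw model output: a JSON array of per-comment codes.
# (Two rows copied from the response above, for illustration.)
raw = """
[{"id":"ytc_UgjR2zO_1LwfgXgCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UggOs3HwjLeo6HgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}]
"""

def parse_codes(text):
    """Extract the outermost JSON array and index its rows by comment id.

    Defensive assumption: the model sometimes surrounds the array with
    stray text, so we slice from the first '[' to the last ']'.
    """
    start, end = text.index("["), text.rindex("]") + 1
    rows = json.loads(text[start:end])
    return {row["id"]: row for row in rows}

codes = parse_codes(raw)
print(codes["ytc_UggOs3HwjLeo6HgCoAEC"]["policy"])  # industry_self
```

A lookup that falls back to the "unclear" values shown in the Coding Result table when an ID is missing from the batch would be a natural extension.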