Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- “@foaux but that’s literally it tho.. ai art utilizes art theft to make the art i…” (ytr_UgyjtL0Ga…)
- “I'm sorry, but calling the technology racist is stupid. Hang that b******* up. I…” (ytc_UgwSJU8-F…)
- “I do 3d art Theres a couple of AI tools id love to have one is Retopology instea…” (ytc_Ugz0TEYwM…)
- “fr tho if it's Jesus were talking about I'm pretty sure we killed him because t…” (ytr_UgxFQODZL…)
- “Honestly it sounds amazing to have all day to do things that I love. I also thin…” (ytc_UgxsiWl6y…)
- “I don't know if I was the only one, but when that robot smiled to the camera 2:0…” (ytc_UghkfZvsZ…)
- “Why do we assume that AI will want nothing in exchange for performing all this w…” (ytc_Ugz0G0MEl…)
- “Makes me wonder… if AI gets sentient and chatgpt starts taking over all humanity…” (ytc_UgxXjpbsq…)
Comment

> One thing you will never be able to teach a robot is human instinct. It's in our DNA, our mind, body and soul is connected as one. That's why we have so many different emotions yet most of the time we operate on primal instinct. Push comes to shove the people will fight back and robots will start dropping like flies. There's no reason for robots to be this human like. Add the X factor to humanity and that might just be all it takes to end us once and for all. Go ahead, laugh, think it's funny but open your mind and think about the long term "What If". It's not being paranoid, it's called common sense. Unfortunately that human instinct is almost obsolete as well

youtube · AI Moral Status · 2019-10-25T02:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | virtue |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgwGpn4Q8Hk5cnrIS1V4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyqFhZjidwbmyEWNth4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxEPyev_bx8DP-xj214AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzjN-n6cpLQpYUHxN14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugzw1xzY9ou34MIl6JN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzDJW5jl-guVkIM13t4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxzemdZfhGjeyIxUZ14AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzfC39y1DkAJ0Geb4p4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwJqkIW7LOS1R9hp614AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugw4BIxZfHqP4XALMxh4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"fear"}
]
```
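The lookup-by-ID step above amounts to parsing the raw batch response as a JSON array and indexing it by the `id` field. A minimal sketch, assuming the model always returns a well-formed JSON array with the fields shown (`responsibility`, `reasoning`, `policy`, `emotion`); the `lookup_coding` helper is illustrative, and the sample rows are taken from the response above:

```python
import json

# Raw batch response: a JSON array of per-comment codings, one object per
# comment, using the field names shown in the response above.
raw_response = """[
  {"id": "ytc_UgzDJW5jl-guVkIM13t4AaABAg", "responsibility": "ai_itself",
   "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwGpn4Q8Hk5cnrIS1V4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "none", "emotion": "outrage"}
]"""

def lookup_coding(raw: str, comment_id: str):
    """Parse a raw LLM batch response and return the coding for one comment ID,
    or None if the ID is not present in this batch."""
    codings = json.loads(raw)
    by_id = {row["id"]: row for row in codings}
    return by_id.get(comment_id)

coding = lookup_coding(raw_response, "ytc_UgzDJW5jl-guVkIM13t4AaABAg")
print(coding["emotion"])  # approval
```

In practice a production pipeline would also validate each row against the coding schema (allowed values per dimension) before trusting it, since model output is not guaranteed to parse.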