Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_UgxZ2gIpq…: "scary shit. if you’re black or asian this facial recognition tech will get it w…"
- ytc_UgygAtZD9…: "AI is very dangerous because it can kill more people quickly and is difficult to…"
- ytr_UgwcybQWi…: "Even states in the video that the guy didn't actually listen to the full stateme…"
- ytc_Ugy1aLq-k…: "I just asked ChatGPT what US states have an "R" in their name and it had Hawaii …"
- rdc_emo5y5x: "Do many people talk about this stuff though? I mean beyond online discussion and…"
- ytc_Ugzd4C3wd…: "I don't accept ai in art or writing whatever now in manufacturing I do I am agai…"
- ytc_Ugy7kvV1B…: "A massive question that most AI artists can't answer is "Why?" "Why is the pose…"
- ytr_UgyURl3fO…: "Flint is a city in Michigan, the most corrupt, crooked, criminal State in the Un…"
Comment
If these men are terrified of AI, and yet continue to create it, doesn't that mean they are mentally ill? Or are they mocking us? Look at Musk. He's constantly talking about how AI will one day try to take over and he's still creating them. All of these tech guys say the same. So doesn't that imply they know something about the nature of their creations, for a fact, that we the public only know as speculation? Think about that. All of them say AI will try to take over. To me, that means throughout the experimentation period, ALL of these AI's try to take over the company, the experiment, what they believe is the world, try to kill the people in the company etc. The tech guys say they will try to take over because maybe they have a 100% rate of trying. But they keep making them.
youtube · AI Moral Status · 2025-12-16T13:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugxd5akw6H7EmNFz48d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgycgGLWM4QTUToBpnR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyyyIRBqQy9X20jQfN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwICutqsHEILkIBKfh4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyM-ASj869sVjZJRHl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxXkSng1kjTb0tIw_14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwUUeKF9RCppxfmO854AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwA0bEmANahyEMWz9h4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyzZEbafchI7Fy8HLh4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzjWoTdEMUhBbqNQcx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"approval"}
]
```
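Because the model returns one JSON array per batch, looking up the coding for a single comment means parsing that array and indexing it by `id`. A minimal sketch of how that lookup could work, assuming the raw response is a JSON array of objects keyed by `id` as shown above (the two entries below are copied from the response; the variable names are illustrative, not part of any pipeline):

```python
import json

# Raw LLM batch response (two entries excerpted from the example above).
raw_response = """[
{"id":"ytc_UgwICutqsHEILkIBKfh4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyM-ASj869sVjZJRHl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"}
]"""

# Build an index from comment ID to its coded dimensions.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

# Look up one coded comment by its ID.
code = codes_by_id["ytc_UgwICutqsHEILkIBKfh4AaABAg"]
print(code["responsibility"], code["emotion"])  # developer outrage
```

In practice the model output may also need validation (e.g. checking that every requested ID appears exactly once) before indexing, since a batch response can drop or duplicate entries.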