Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "How is a human gonna fight a robot if you throw a jab and land it then your hand…" (ytc_Ugxkpwe-5…)
- "The best thing about art at least for me is that the artist use all the knowledg…" (ytc_UgxwohoMz…)
- "first one, there's no way the second on is AI, you people are so easy to fool…" (ytc_UgyF0P5rb…)
- "They've got time. It will take years before a majority of cars are self-driving,…" (rdc_dmokprj)
- "Oh so now the cops will blame computers. \"The A.I. identified a gun with a 99.9%…" (ytc_Ugzjt0zrw…)
- "My 3 year old Grandson knows how to discern AI videos and Chat GPT voice interac…" (ytc_UgxEapBvp…)
- "Why battle the Cartels and make this big fuss, when A.I is the real Criminal and…" (ytc_Ugz7KsjCr…)
- "This is so fundamentally wrong. Computer scientists anthropomorphizing inert ele…" (ytc_Ugy0wQMFh…)
Comment

> Just look at the game “Detroit: become human” it might soon become reality if we carry on like this..while i know it’s just a far fetched story, and it might be true that Sophia and other bots will never be real AI, it would be way better if we stopped this Robot-Human thing before they start start making them develop feelings and attachment or even real AI. You heard the guy, he said he does believe there will be a time when Robots and humans will be “indistinguishable” and I really think that is something we DON’T need! Especially with all the moral conflict it will cause, we have more important issues to solve before creating AI... ugh and not to mention it’s scary af

Source: youtube · Video: AI Moral Status · Posted: 2019-11-29T22:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugxi77H8bZM_7sbna7V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxybtOyXSUDS_ZNYJt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugzuwvmi7JNh6WAZ8Ip4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugwq7qD240Ox-w1m1Qx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxFqZ75KyudWjpyikh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx1n7aEIZ5dcCso1CZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz2w2ZgKGabgqxmygd4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxOj53RTL1FSwd1yhl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugzd3eaN1upuwG7E0-F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwqvOuqIZNdcZhdeeF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
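A raw response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal example, assuming the allowed values for each dimension are exactly those seen on this page (`responsibility`, `reasoning`, `policy`, `emotion`); the real codebook may define additional categories, and the helper name `validate_response` is illustrative, not part of any tool shown here.

```python
import json

# Allowed values per coded dimension, inferred from the samples on this page.
# Assumption: the actual codebook may include categories not seen here.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user", "developer"},
    "reasoning": {"unclear", "consequentialist", "virtue"},
    "policy": {"none", "ban"},
    "emotion": {"approval", "fear", "resignation", "indifference", "mixed"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response (a JSON array of coded comments) and
    reject any row whose dimension value is outside the known codebook."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(f"{row.get('id')}: unexpected {dim} value {value!r}")
    return rows

# Example with one well-formed row (hypothetical comment ID).
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"unclear","policy":"none","emotion":"fear"}]')
rows = validate_response(raw)
print(len(rows))  # 1
```

Rejecting the whole batch on a single bad value is a deliberate choice here: a malformed row usually means the model drifted from the requested schema, so re-prompting is safer than silently dropping the row.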