Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

| Comment excerpt | ID |
|---|---|
| neuralink is the equalizer to AI? Sounds like AI would just want to hack neurali… | ytc_Ugy51N4mY… |
| So true. I think people are getting way too excited about autonomous trucks. The… | ytr_UgyI5enBq… |
| Notice how poeple like bill gates always scream about mitigating risks of techno… | ytc_UgwrdAsgY… |
| If someone is open that an image is AI and they are not an artist then I am okay… | ytc_UgyGJLtp8… |
| I think this is why a lot of the AI stuff sounds the same now. Especially when … | ytr_UgzlAXt0s… |
| if u could copy right Ai art then wouldn't it belong to the company who created… | ytc_UgxXFAsIk… |
| I got all correct but I made a small mistake by saying 2 is real but it looked a… | ytc_Ugylw1lBl… |
| It’s fake the robot is a real person this is all edited. Very well edited tho 😂… | ytc_UgyP3TnC3… |
Comment

> This guy with hut ...scares me allot....his creating the destruction of human race and the future generations of human life...
> He is so focused on creating artificial intelligence that he oversee the potential danger of his creation to human and to the world....i support the creation of robots to help the human on there every day life.....but not at this level...that his making this robots much smarter than ordenarry human being....it's too dangerous.

Source: youtube · AI Moral Status · 2021-10-24T17:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugyv9tkouzibwPFFYod4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwLRB1fFPW11xBksxN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwqMTaVBSo6NzX3GtZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxEzaRyXh-r2xq8l754AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzUYB9Ot7WZIXHVXE94AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxdiYGBB4LZxlIfqyl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwKrTUgwt8ANSFJJll4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzu0OsfRhpyUa1D6Zt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgycW7CHTHkYp08THpV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwhcqD_feExmWS0d0J4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"}
]
```
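A raw response like the one above is a JSON array of per-comment codes. Here is a minimal sketch of how such a batch might be parsed and validated before the rows are stored, assuming the value sets seen in this sample are the allowed categories; the project's actual codebook (not shown here) may define more values, and `validate_batch` is a hypothetical helper, not part of any tool in this page.

```python
import json

# Allowed values per dimension, inferred from this sample batch only
# (an assumption; the real codebook may include additional categories).
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "mixed", "indifference"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and check each row against the value sets."""
    rows = json.loads(raw)
    for row in rows:
        # IDs in this data start with ytc_ (comment) or ytr_ (reply).
        assert row["id"].startswith(("ytc_", "ytr_")), f"bad id: {row['id']}"
        for dim, allowed in ALLOWED.items():
            assert row[dim] in allowed, f"{row['id']}: bad {dim}={row[dim]!r}"
    return rows

# One row from the batch above, used as a smoke test.
raw = ('[{"id":"ytc_UgxEzaRyXh-r2xq8l754AaABAg","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
rows = validate_batch(raw)
print(len(rows))  # 1
```

A check like this catches the common failure mode where the model invents a label outside the codebook, before the bad row silently lands in the coded table.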