Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
@opop-b2p you sound like an ai bro yourself that never draw or invent something…
ytr_Ugx9pfHku…
AI stuff is pretty shitty. If you're really that good, you don't have anything t…
ytr_UgypRR_hv…
@WayneClark I do not know how demons can enter robot's but aye nothing is out of…
ytr_UgxYWRS5J…
It's so sad how looking through the comments, it's like not one person understan…
ytc_Ugy9KD_pw…
Hey there! It seems like you find the concept of AI models like Sophia quite int…
ytr_UgwT3i7xY…
My question is: Why do 99% of viewers of this video who comment, automatically a…
ytc_UgxzYq1cU…
ai will reduce us to very small number to keep us as pets. we are spread all ove…
ytc_UgzKg7Llv…
That’s exactly how to handle this. Through the no copyright no monetized model. …
ytc_UgzG1JBTO…
Comment
There are so many what ifs in this video making it a very philosophical question. And philosophy is in many cases open ended without a straight answer. The entire video is a prime example of how bounded our rationality is as the comparison is drawn with human history. But chances are these comparisons are void as robots are in fact an entire new species with their own characteristics and dynamics. It already starts with the assumption that a robot needs to have feelings in order to have rights which is pretty presumptuous. The lawn at Central Park doesn't have feelings but it does have rights.
youtube
AI Moral Status
2017-02-25T12:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UghKoK55MKjPi3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgjLk4dwj6E7c3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgifpkGSnco6Q3gCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugh3oGA9UWKfbXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ughlf5QYg265ZngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugjui8lyYzSrvHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiuXt-nUv5jbngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UggvBcByL6n803gCoAEC","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_UgjWed3DMfpEnHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UghAqkiQfyzw4HgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
```
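A raw response like the one above can be checked programmatically before the codings are stored. The sketch below is a minimal validator, assuming the four dimensions shown in the Coding Result table; the allowed value sets are inferred from this single sample and are an assumption — the full codebook may define more categories. The function name `validate_codings` is hypothetical.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# ASSUMPTION: the real codebook may permit additional categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "developer"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"none", "ban", "liability", "regulate"},
    "emotion": {"approval", "fear", "indifference", "resignation",
                "mixed", "outrage"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject rows with unknown code values."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(
                    f"{row.get('id')}: unexpected {dim} value {value!r}"
                )
    return rows
```

Validating at ingest time keeps a single malformed LLM response from silently polluting the coded dataset.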