# Raw LLM Responses

Inspect the exact model output for any coded comment: look a comment up directly by its ID, or pick one of the random samples below.
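As a rough sketch of what the ID lookup amounts to programmatically, assuming the coded results have been exported as a JSON Lines file (the file name `coded_comments.jsonl`, the export format, and the `lookup_comment` helper are all illustrative, not part of this tool):

```python
import json
from pathlib import Path

def lookup_comment(comment_id: str, export_path: str = "coded_comments.jsonl") -> dict | None:
    """Return the coded record for comment_id, or None if it is not present.

    Assumes one JSON object per line, each carrying an "id" field such as
    "ytc_Ugh4xkVi4MfetHgCoAEC".
    """
    with Path(export_path).open(encoding="utf-8") as fh:
        for line in fh:
            record = json.loads(line)
            if record.get("id") == comment_id:
                return record
    return None

# Example: fetch the record for one of the IDs shown on this page.
print(lookup_comment("ytc_Ugh4xkVi4MfetHgCoAEC"))
```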
## Random samples — click to inspect

- "Judging by all the automation they have and how well it works. They’re just hiri…" (`ytr_UgwONjV2J…`)
- "He's a software engineer, oh man, and an AI engineer has to get experience and a master's…" (`ytr_UgyY_ctT6…`)
- "We're glad to hear that the conversation made you rethink things! The interplay …" (`ytr_Ugx4F5czH…`)
- "As soon as anybody mentions the UN or international legislation, you know we are…" (`ytc_UgxH8bET-…`)
- "Lol, they always say auto pilot is not self driving. The mistakes have always be…" (`ytc_UgxabiCvq…`)
- "I find the superficial and naive part of this video to be its comparison between…" (`ytc_UgyxiFJOl…`)
- "AI can't set up lights and move mics and stuff yet. They are still necessary un…" (`ytr_Ugw0kbRaJ…`)
- "AI IS EVIL!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! as immoral people created it, the…" (`ytc_UgwmP0hJO…`)
## Comment
This video assumes that robot intelligence will develop in a manner similar to humans. It's possible that while sentient, they could turn out completely alien minded in comparison to us. Robots could value and fear things that are at complete odds to our instincts, or are incomprehensible to us. Demanding that robots have rights similar to humans may actually be detrimental to robots. Would a being who never tires worry about working hours, or would a being that can upload its consciousness to the internet perceive death of the body the same way we do? If robots do receive rights in the future those rights would have to be tailored to how robots function and not just expect human rights to be a fit all solution.
Platform: youtube · Video: AI Moral Status · Timestamp: 2017-02-24T02:4…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | contractualist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
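These coding dimensions map naturally onto a typed record. A minimal sketch in Python, where each `Literal` set is an assumption listing only the labels visible on this page (the underlying codebook may define more categories):

```python
from typing import Literal, TypedDict

# Assumption: these label sets include only values that appear on this page;
# the actual codebook may contain additional categories.
Responsibility = Literal["none", "user", "ai_itself", "distributed"]
Reasoning = Literal["consequentialist", "deontological", "contractualist",
                    "virtue", "mixed", "unclear"]
Policy = Literal["none", "liability"]
Emotion = Literal["approval", "fear", "outrage", "indifference",
                  "resignation", "mixed"]

class CodedComment(TypedDict):
    id: str                        # e.g. "ytc_Ugh4xkVi4MfetHgCoAEC"
    responsibility: Responsibility
    reasoning: Reasoning
    policy: Policy
    emotion: Emotion
```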
## Raw LLM Response

```json
[
  {"id": "ytc_Ugh4xkVi4MfetHgCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UggLQKwVGkmGH3gCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UggAnOn8fXWe_XgCoAEC", "responsibility": "none", "reasoning": "contractualist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgglPt9FSMOxZHgCoAEC", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgijavkW4w4I8HgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugg4Od1C-VYHqHgCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UggUKbdXKJJrMngCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UggEZyTQU4SE3ngCoAEC", "responsibility": "none", "reasoning": "contractualist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugiata-MDSuPkHgCoAEC", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UghrBLrWi9JmwHgCoAEC", "responsibility": "distributed", "reasoning": "mixed", "policy": "none", "emotion": "resignation"}
]
```
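Because the model returns a bare JSON array, a natural downstream step is to parse and sanity-check it before loading records anywhere. A hedged sketch, where the validation rules are illustrative rather than this tool's documented pipeline:

```python
import json

REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_llm_response(raw: str) -> list[dict]:
    """Parse a raw model response, rejecting structurally malformed records."""
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a top-level JSON array")
    for i, record in enumerate(records):
        if not isinstance(record, dict):
            raise ValueError(f"record {i} is not a JSON object")
        missing = REQUIRED_KEYS - set(record)
        if missing:
            raise ValueError(f"record {i} is missing keys: {sorted(missing)}")
    return records
```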