Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- “See why is all this even being called Ai? Shouldn’t it be called Vi (Virtual int…” (ytc_UgxKNUJ1L…)
- “If the time comes when AI can actually think and learn on its own (it can not at…” (ytc_UgzVjHLta…)
- “24 minutes of thinking 95% customers and consumers not giving a f…. If an art p…” (ytc_UgweJ3MYA…)
- “@zandrrlife the default for the top AI companies is increasingly large, expensiv…” (ytr_Ugz-xaGPm…)
- “The reality of the situation is that you can't tell when AI has been a part of t…” (ytc_Ugz-3R8l5…)
- “See and thats just it that neurological equivalent process is unknown in humans …” (ytc_UgxO-gtGT…)
- “This video is so full of bad takes. It assumes that AI will turn into competent …” (ytc_Ugyqjsqfu…)
- “The meta AI artificial intelligence program is available for free... They are no…” (ytr_Ugxgaky-l…)
Comment
No. A non-living THING does not deserve rights. Robots do not have a symbiotic relationship with humans, animals, and plants and therefore do not deserve the protections of beings that are alive. We think and feel and provide intrinsic and external benefits to one another. Robots are the creations of human beings and are programmed to complete tasks. Computers/robots/machines process. They do not think or feel. They mimic. They do not originate. People, animals, and plants are in a fully symbiotic relationship with one another. They way we operate (when in harmony with nature) provides benefits to each other. I would feel horrible about harming an innocent person, harming an animal who wasn’t trying to attack me, or burning down a forest. I would not feel anything if I dropped a robot off a cliff...and that robot wouldn’t feel anything either because it is not alive.
Platform: youtube | Video: AI Moral Status | Posted: 2020-07-08T15:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugym08kqdNxUkx2-10h4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugw_FVBepk0HmLi4XIl4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx-doPVTFSsH45POn14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugy_U1c8hw1dbnQUZP54AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwSfROM85Ux7Gs0y7F4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzOEG9iNvpbccy3GwZ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzX6dYAXDanVaf0hUh4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwRMrZoIrY08Mv59Ht4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxKIra1BpAyZvQy4YZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwnY8UO0f3NEOf4jd94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"outrage"}
]
```
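A batch response like the one above can be parsed and looked up by comment ID with a short sketch like the following. The allowed values per dimension are inferred only from the codes visible on this page; the actual codebook may define more categories.

```python
import json

# Allowed values per coding dimension (assumed from the values shown above;
# the real codebook may include additional categories).
SCHEMA = {
    "responsibility": {"none", "user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed"},
    "policy": {"none", "ban", "unclear"},
    "emotion": {"indifference", "outrage", "fear", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: coding},
    rejecting rows whose values fall outside the assumed schema."""
    coded = {}
    for row in json.loads(raw):
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
        coded[row["id"]] = {dim: row[dim] for dim in SCHEMA}
    return coded

# Look up one coded comment by its ID (single-row example from the batch above).
raw = ('[{"id":"ytc_Ugw_FVBepk0HmLi4XIl4AaABAg","responsibility":"none",'
       '"reasoning":"deontological","policy":"none","emotion":"indifference"}]')
coded = parse_batch(raw)
print(coded["ytc_Ugw_FVBepk0HmLi4XIl4AaABAg"]["reasoning"])  # deontological
```

Validating each row against the schema before storing it catches the occasional out-of-vocabulary label an LLM coder can emit, rather than letting it silently enter the dataset.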