Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "dude, it's an experimental conversation with ai, he has dozens of videos with re…" (`ytr_Ugxm6SAfb…`)
- "Except it's not participating in the activity even if they're nonsense about Art…" (`ytc_UgybeYQk9…`)
- "I wonder if in that case what if there was also another self driving car around …" (`ytc_UgiCg24wo…`)
- "The interviewer is not too bright concerning the questions and her interactions …" (`ytc_UgzChEFiC…`)
- "People using these remix / munging / mash up machines, really don't understand, …" (`ytc_UgwIPqdn7…`)
- "I enjoy AICarma's weekly insights; they help me stay ahead of industry trends an…" (`ytc_UgybqNxc9…`)
- "I think we will find that gen-ai will go away because it isn't profitable is mis…" (`ytc_UgygPcn8m…`)
- "Text based AI are better than Generative AI imo / I’m not even sure if there’s a d…" (`ytc_UgxPLrIfr…`)
Comment
Rights to any organism are based on the necessity to keep it from dying or feeling pain. Human rights for example are the rights to have food, water, and shelter, that's it. Animal rights are there to protect their habitat so they don't go extinct. So here's the question, do robots have the same needs as living things? No. A robot does not feel, has no consciousness beyond what it's programmed to have, and if you shot it, stabbed it, shocked it, drowned it, it wouldn't suffer. Hell, even shelter isn't a necessity due to how robots can function indoors and outdoors without suffering from the elements (yes, they will rust and get damaged over time, but getting damaged is not the same as suffering since again, to suffer means you need to have the ability to feel pain). The only thing robots really require is a power source to keep them functional and routine checkups and repairs to keep them function, and those I do agree should be provided.
Source: youtube | Video: AI Moral Status | Posted: 2025-03-25T18:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxhW1LrBWlHVtsTFbp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyuBvq-R9-yk2S2_jV4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzRLjdKhsEnkF2THQV4AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugws86BqMaqEX7wJf7p4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyGlF546vU4c85nBbV4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugw4z_uQH40GaCwo7BJ4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugz6whgLO8CrN7-FczF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgyVgE6c6ndUOVYtgxx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxPzbXNhJy49pwX1v54AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgzNJY2RPX9oiuzv34h4AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
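A minimal sketch of how a raw batch response like the one above can be turned into per-comment codings: parse the JSON array and index it by comment ID, so that looking up a comment yields its four coded dimensions. The ID used here is taken from the sample response; the variable names are illustrative, not the tool's actual implementation.

```python
import json

# One entry from the raw LLM batch response above (a JSON array of
# per-comment codings across four dimensions).
raw_response = """
[
  {"id": "ytc_Ugw4z_uQH40GaCwo7BJ4AaABAg",
   "responsibility": "unclear",
   "reasoning": "consequentialist",
   "policy": "unclear",
   "emotion": "indifference"}
]
"""

# Index the codings by comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_Ugw4z_uQH40GaCwo7BJ4AaABAg"]
print(coding["reasoning"])  # consequentialist
print(coding["emotion"])    # indifference
```

In practice the raw response would first be validated (every ID present, every value drawn from the allowed labels such as `unclear`, `mixed`, `consequentialist`) before being written to the coding-result table.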