# Raw LLM Responses

Inspect the exact model output for any coded comment.

## Look up by comment ID

## Random samples
- "AI is more than technology. It's actually Human Computer Interaction or Human Ma…" — `ytc_Ugxui5axf…`
- "I don't need my job - I just need money. Give me a basic social income - and let…" — `ytc_UgxAWfvYH…`
- "The AI bubble has been burst open. Not surprised ngl, only shocked it didn't hap…" — `ytc_UgxmV_tu2…`
- "The 3 laws of robotics 1. a robot must never harm a human. 2. a robot must never…" — `ytc_UgwUtPHHo…`
- "Sorry, Bernie, but this is a losing case. Throughout history, progress has alway…" — `ytc_Ugy904zzH…`
- "Do you hate the fact that AI has exposed how outdated and borken the education s…" — `ytr_UgwPAKbYy…`
- "🤣 AI is not an issue! It does not even come close to biological capabilities tha…" — `ytc_UgwUHSMon…`
- "Seems like some people in poorer regions will be paid a few dollars for eye scan…" — `rdc_ohzo19j`
## Comment

> I would definitely let my toaster have rights, as long as it let me toast stuff in it!
> But to be serious, the problem really lies in ownership. If I were to build a self-aware robot, does that mean I technically own it? Or is it like, say, having a child, in which they are their own person with their own rights, and you are just taking care of them until they have matured enough to live on their own. I mean, what would companies that make self aware AI do? They created the object for retail purposes, do they own them and get to buy and sell them as they please? Personally, I feel that if someone creates a self-aware AI, the robot chooses what happens to them. And if you want a slave, don't make your robot self-aware, dummy!

youtube · AI Moral Status · 2017-02-24T00:0…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
## Raw LLM Response
```json
[
{"id":"ytc_Ugj9IyZDvRhiXXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UghWo3usIOacZHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiqcS6GbvZn33gCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugg8tFgIyqOuHngCoAEC","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugjx1FPDgjaEH3gCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgiFFRZ2iutea3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UghDmkqr0-MViHgCoAEC","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugi-LyhtQAx1hHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgiYyXt_y--VengCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgiQ1uvFGvyUl3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
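A response like the one above can be turned into coded records with a few lines of Python. This is a minimal sketch, not the tool's actual pipeline: the `ALLOWED` sets are inferred only from the values visible on this page (the real codebook may permit more), and `parse_batch` is a hypothetical helper name.

```python
import json

# Allowed values per coding dimension, inferred from the samples shown
# above (assumption: the real codebook may define additional values).
ALLOWED = {
    "responsibility": {"developer", "user", "government", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "indifference", "mixed", "approval", "resignation", "outrage"},
}

def parse_batch(raw: str) -> list:
    """Parse one raw LLM batch response and check every record's values."""
    records = json.loads(raw)
    for rec in records:
        for dim, values in ALLOWED.items():
            if rec.get(dim) not in values:
                raise ValueError(f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}")
    return records

# Example: the fourth record from the response above.
raw = ('[{"id":"ytc_Ugg8tFgIyqOuHngCoAEC","responsibility":"user",'
       '"reasoning":"deontological","policy":"liability","emotion":"approval"}]')
batch = parse_batch(raw)
print(batch[0]["policy"])  # liability
```

Validating against a closed value set at parse time is what makes the "Coding Result" table trustworthy: a malformed or hallucinated label fails loudly instead of silently entering the dataset.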