Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "Everyone who says AI won't take any jobs are just deluding themselves in false h…" (ytc_UgxSMCncq…)
- "saw someone saying something similar about having aphantasia and suggesting that…" (ytc_Ugw5QCOrW…)
- "Once AI becomes A adult then an old men This world will be A Robot world… humans…" (ytc_Ugz_oiTVb…)
- "Between AI and TikTok, i'm pretty sure the latter is spreading *even more* bad a…" (ytr_UgwF9Tvrk…)
- "😂😂😂 we don't care once your robots are in place to work for your business. I as …" (ytc_UgwEKWnjz…)
- "how these laws would be crafted, they need to be very, very specific. \"Non-conse…" (ytc_UgyTv8nYB…)
- "Big money is behind this AI technology. The industry will change forever, like a…" (ytc_UgyNTXYrL…)
- "I respect what the guy is saying. Problem is, we haven’t automated coal mining,…" (ytc_UgxhN2IxK…)
Comment
My main problem with giving sentient AI rights is that they'd probably be programmed to serve vested interests by their makers. I don't want to have to trust people who could be pre-built to serve corporations or governments.
"My girlfriend broke up with me, do you have any advice?"
"Well, you should go home, buys some Bluebell™ icedcream, and then relax watching "Don't question your authority"™. It has Chris Pratt in it!"
"Oo I love that guy. Thanks."
Then again, humans are also pre-programmed with vested interests, but that's mainly just for passing genes and preserving the species.
youtube · AI Moral Status · 2017-02-24T00:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugj9IyZDvRhiXXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UghWo3usIOacZHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiqcS6GbvZn33gCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugg8tFgIyqOuHngCoAEC","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugjx1FPDgjaEH3gCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgiFFRZ2iutea3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UghDmkqr0-MViHgCoAEC","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugi-LyhtQAx1hHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgiYyXt_y--VengCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgiQ1uvFGvyUl3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
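The raw response above is a JSON array of coding objects, one per comment. A minimal sketch of the "look up by comment ID" step, assuming that structure (the parsing helper here is hypothetical, not part of the tool; the two entries are copied from the response above):

```python
import json

# Raw LLM response: a JSON array of coding objects keyed by comment ID,
# in the structure shown above (two entries reproduced for illustration).
raw_response = """
[
  {"id": "ytc_Ugj9IyZDvRhiXXgCoAEC", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UghWo3usIOacZHgCoAEC", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse a raw LLM response and index the coding objects by comment ID."""
    codings = json.loads(response_text)
    return {entry["id"]: entry for entry in codings}

# Look up one comment's coding by its ID.
lookup = index_by_id(raw_response)
coding = lookup["ytc_Ugj9IyZDvRhiXXgCoAEC"]
print(coding["policy"])   # → regulate
print(coding["emotion"])  # → fear
```

Indexing by ID rather than scanning the list makes repeated lookups constant-time, which matters when many coded comments share one batched response.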