Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Ofc some scientists would create a robot with feelings "just for research/fun" b…" (ytr_Ugi0V_EZK…)
- "@lolalalia4119 non technical tldr - AI is lot like a lossy video compression alg…" (ytr_UgwLlVjlq…)
- "This is pretty much it, it's bordering almost on a grift. My company has been ta…" (rdc_moxah3s)
- "People need to WAKE UP!! You will HEAR about AI replacing workers in call center…" (ytc_UgxRKlS_D…)
- "Fun fact i got my entire art # got stolen by ai ... Just wow i even made an wate…" (ytc_UgwF_mWNM…)
- "I was once accused of using AI, which is interesting because I only do pen portr…" (ytc_UgzM-pCsw…)
- "Bingo. On the whole, AI summaries are a scourge but a stopped clock is right twi…" (rdc_nuaoysm)
- "This moment in time is our opportunity to break free from our enslavement and th…" (ytc_UgzTNiF3I…)
Comment
The answer is NO, if a human programmed a "robot" to have emotions, they would be superficial and projected, and would be just as ridiculed and cast out as the SJW trend that rose in the early 2000s and is now dying.
Robot's will get rights when AI chips LEARN from their surroundings, and can retain it, after a forced or mandatory shut down and/or reboot.
Programming =/= Sentience.
Source: youtube · AI Moral Status · 2017-02-24T09:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugh0c4l23P6EYHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgizmdfK6BHeengCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgiS9-lmbu6FW3gCoAEC","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"fear"},
{"id":"ytc_Uggg7_XeDnLEkXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgjlRCoviv8l7XgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgiAi7l2Sx79l3gCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgiM-TwLKWJZ13gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UghE_QrjN0MWgHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugi_n0NFADJiGngCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Uggf753UlzgQ93gCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
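The lookup-by-ID step above can be sketched in a few lines: the raw model output is a JSON array of per-comment codings, so indexing it by the `id` field recovers the row rendered in the Coding Result table. This is a minimal sketch, not the tool's actual implementation; the `index_by_id` helper and the truncated `raw_response` excerpt are illustrative, with IDs and values taken from the response shown above.

```python
import json

# Two rows excerpted verbatim from the raw LLM response above.
raw_response = '''
[
 {"id":"ytc_UghE_QrjN0MWgHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgizmdfK6BHeengCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"}
]
'''

def index_by_id(response_text: str) -> dict:
    """Parse the model output and index each coding dict by its comment ID."""
    return {row["id"]: row for row in json.loads(response_text)}

# Looking up the inspected comment reproduces the Coding Result table.
coding = index_by_id(raw_response)["ytc_UghE_QrjN0MWgHgCoAEC"]
print(coding["responsibility"], coding["emotion"])  # ai_itself outrage
```

If the model ever returns malformed JSON, `json.loads` raises `json.JSONDecodeError`, which is a natural place to flag the response for manual inspection rather than coding.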