Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
I hate the use of AI for all of the reasons states, but ALSO because of the envi…
ytc_UgwXk3Prt…
I think the only thing the skin is missing is the sub-surface scattering that co…
ytc_UgxYeenXL…
These LLMs and Robots have been trained on our IP, we need to demand lifetime ro…
ytc_Ugxf3eW5Z…
Where are the political parties banning AI? I see them winning the future electi…
ytc_Ugw6bpkbz…
1:25:50 "Democratically contest" That doesn't work in communist controlled count…
ytc_UgzUZCP7F…
@nimbusloud I didn't really have a point. I was mostly commenting to give enga…
ytr_UgxAlDIl6…
And not to mention, so many things happen when people actually do the art themse…
ytc_Ugwf3ZWCd…
Rather sad situation. I could kind of give a discount for AI-generation stuff fo…
ytc_Ugw3LIi9_…
Comment
I fail to see why many of these AI speculations assume that we will force a sentient machine to work. Sure, it will likely happen in some instances. But if we as human's spent centuries dehumanizing each other for labor, why would we even bother elevating robots to that level?
As to the "It'll happen before we realize" argument, I wager the machines will be set back long before they have the chance to develop consciousness. If they start making *any* decisions that don't fit their intended purpose, they'll be reprogrammed to remove that quirk.
Now, the luxury and pleasure industries. Those I concede may give rise to artificial intelligence. We already see dating sims and chatbots popping up with ever increasing detail. It may even be that marriage counselors will be the most equipped at dealing with AI. xD
youtube
AI Moral Status
2017-02-27T01:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugg3xHoUtx6gWngCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UggE8ZCLy_Y7-XgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UghT7BJ_Jkv_q3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UghD0PHvZSddz3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UghzdlAYEf702XgCoAEC","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugi0wlK0xxTZ3XgCoAEC","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UggL_n6lQWteeXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UggLBNtGHpEtHHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgiX-KqMqEVNV3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgivRlFZ-T5UaXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
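Each record in the raw response codes one comment on four dimensions. A minimal sketch of how such a response might be parsed and sanity-checked is below; the allowed value sets are an assumption inferred only from the values visible on this page (the project's actual codebook may define more categories), and `validate_response` is a hypothetical helper name, not part of this tool.

```python
import json

# Allowed values per coding dimension (ASSUMPTION: inferred from the
# values visible in the table and raw responses above; the real
# codebook may include additional categories).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "government"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"fear", "indifference", "outrage", "mixed", "resignation"},
}

def validate_response(raw: str) -> list:
    """Parse a raw LLM coding response and check every record against
    the allowed dimension values. Raises ValueError on a record that
    is missing a dimension or uses an out-of-vocabulary code."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec.get('id')}: bad {dim!r} value {rec.get(dim)!r}"
                )
    return records
```

Validating before ingest catches the common failure mode of LLM coders drifting outside the codebook vocabulary, so a bad batch fails loudly instead of silently skewing the dimension counts.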