Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "You miss the point then. Just because the AI art isn’t a precise Rutkowski doesn…" (ytr_Ugyw5JDOl…)
- "AI and advance robotics will. And the billionaires don’t care about the human l…" (ytc_UgzhEJEsp…)
- "The world isn't ready for AI yet! For one reason, we would need world peace, and…" (ytc_Ugw9ADs5M…)
- "Pleaee stop speading this divisive BS. Does anyone remener when the chinese made…" (ytc_UgwUDez-H…)
- "If it gets that bad, the Agarthans from Neuschwabenland will team up with the gr…" (ytc_UgxprDs5R…)
- "Here’s my thoughts I feel like everyone gets one robot and you train it to do wh…" (ytc_UgyB1dRCQ…)
- "I realized I wrote a poem about my struggle to find metal help that mentions AI,…" (ytc_UgywPmMu4…)
- "Ai steals from other people’s art and mashes it together. It has no “imagination…" (ytr_UgxeNjvvK…)
Comment
Honestly, I feel like if (when) robots gain sentience, if they start off as appliances people will be more likely to enjoy their company than that of other humans; if something exists to serve you, you'll have more of an attachment to it and people who don't will seem less and less pleasant over time. We already have problems with people who think others exist only for their benefit, but if sentient AI proliferates it'll most likely outnumber humans so quickly we'll essentially all become that guy. You can just copy and paste code, after all.
youtube
AI Moral Status
2017-07-31T14:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugj3khvLILefu3gCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgjkvhPCQfER1XgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggkkBXOQTM7nngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UghQgFsiOnvSDngCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgiJo7bnF0HeVXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgidMGiSpVopw3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugg7qlWvQgN3N3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgiSbIvA5BJ4O3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UghV2ZWqZTg1QHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UggUB5a8zOw5mngCoAEC","responsibility":"ai_itself","reasoning":"contractualist","policy":"liability","emotion":"approval"}
]
```
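A response like the one above is only usable downstream if every row carries a valid value on each of the four coding dimensions. The sketch below shows one way to parse and sanity-check such a response; note that the allowed-value sets are inferred solely from the ten rows shown here, so the real codebook may permit additional categories, and `validate_codings` is a hypothetical helper, not part of any tool shown on this page.

```python
import json

# Allowed values per dimension, inferred only from the sample output above
# (assumption: the actual codebook may define more categories).
SCHEMA = {
    "responsibility": {"none", "user", "developer", "ai_itself"},
    "reasoning": {"mixed", "unclear", "deontological", "consequentialist", "contractualist"},
    "policy": {"none", "ban", "liability"},
    "emotion": {"mixed", "indifference", "approval", "outrage", "fear", "resignation"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and check each row against the schema."""
    rows = json.loads(raw)
    for row in rows:
        # Comment IDs in the samples start with ytc_ (comment) or ytr_ (reply).
        if not row.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment ID: {row.get('id')!r}")
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: bad {dim} value {row.get(dim)!r}")
    return rows

raw = ('[{"id":"ytc_Ugj3khvLILefu3gCoAEC","responsibility":"none",'
       '"reasoning":"mixed","policy":"none","emotion":"mixed"}]')
rows = validate_codings(raw)  # raises ValueError on any out-of-schema value
```

Failing loudly on an out-of-schema value is usually preferable to silently coercing it, since a stray label often signals that the model drifted from the prompt's codebook.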