Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytc_UgzBSKJoq…` — "Technology is unstoppable, and pretending otherwise is foolish. The greatest dan…"
- `ytc_Ugyq9BHmp…` — "AGI will never happen. AI will do amazing things, yes, but a computer will neve…"
- `ytc_UgzttizfC…` — "Or is the problem that AI is completely unbiased and doesn't take historical inj…"
- `ytr_Ugy6sqhhq…` — "@Itachi0609 I think most of us know, I'd be surprised if anyone didn't have at l…"
- `ytc_UgybkCH2L…` — "I'm going to have a Boston dynamics robot compete in the Boston Marathon for me.…"
- `ytc_UgwKYd7D4…` — "The AI is not at fault, it is the parents.. and as for the 14 yr old girl..she c…"
- `ytr_Ugylzk5U-…` — "More than America? They literally copy everything from America, from Military Te…"
- `rdc_ju1lib4` — "That should be the end game. AI replace all human workers and we can all just ch…"
Comment

> A robot can only do and "think" what you program it to. A robot can only _learn_ if you program it to. Otherwise they are static, they can't be personified no matter how much we try. If we ever create sentient robots, it will have been either because we wanted to, or because of a serious oversight on the part of creators. I hope that sentient AIs are addressed by governments once they're more relevant. It would be very dangerous to have a machine with "free will," especially when the greatest threat to any sentient AI is humanity.

Source: youtube | AI Moral Status | 2017-02-24T00:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgguZkakQ-aSIHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgiVYzcqK51muHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgjAwYV7bNsWrngCoAEC","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgiM2ON3rg2HuXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgjoNG0qXemj1HgCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UggnlcXdFfJjMngCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UggQEebRU0W_WngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgjFW1C80PQDVngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgijAzfQFlDQrngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugj3wix4Hs8P1ngCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
```