Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytc_UgygwQ8ZG…`: "I guess Companies Now experimenting with AI that can we replace Developers in ge…"
- `ytc_UgxS0oyUt…`: "My ex from 20 years ago, she had been banned from every convention a 12-hour dri…"
- `rdc_liy1gt8`: ">Did your grandmother fall for all those outrageously fake images too? In m…"
- `ytr_UgwXEv5zJ…`: "@elisabethhowse yes you are so right^ this is also why google isn't allowing the…"
- `ytc_UgxRFQpl9…`: "Here is an idea the Government should make companies pay more for using AI and p…"
- `rdc_dcrivrx`: "> I don't think we can meaningfully think about our future economy in terms o…"
- `ytr_UgzgrL8mG…`: "OMG IT REPLACES THE AVERAGE WORKER???? WHAT ARE WE GOING TO DO!!!!!!!!!!!!!!! OU…"
- `ytc_UgzNa5Atr…`: "Imagine hating technology. AI art is different, not inferior. Artists will alw…"
Comment
Is programming a robot to feel pain not, in essence, the same brand of cruelty as inflicting pain on an animal?
Perhaps it's worse? To give something without a sense of pain, that sense, we not only inflict upon them the pain of our own treatment but suddenly we inflict upon them all the pain they could ever experience at the hands of the world at large.
Are we not as a species interested in removing, or lessening our own sense of pain? Why would we then commit the exact reverse of this on an 'entity' that hasn't that sense itself?
Platform: youtube
Topic: AI Moral Status
Posted: 2017-05-22T17:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
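The coded record can be thought of as one flat row per comment: the four coding dimensions plus a timestamp. The sketch below is illustrative only, not the pipeline's actual schema; the field names mirror the table above and the keys visible in the raw response below, and the example values are just the ones observed in this sample.

```python
from dataclasses import dataclass

# Minimal sketch of one coded comment. Field names follow the Coding Result
# table above and the keys in the raw LLM response below; the flat-record
# layout and value sets are assumptions based on this sample only.
@dataclass
class CodingResult:
    comment_id: str      # e.g. "ytc_Ugh85duhMW553XgCoAEC"
    responsibility: str  # e.g. "developer", "company", "ai_itself", "distributed", "none"
    reasoning: str       # e.g. "deontological", "consequentialist", "contractualist", "mixed"
    policy: str          # e.g. "liability", "regulate", "ban", "none", "unclear"
    emotion: str         # e.g. "outrage", "fear", "approval", "resignation", "indifference"
    coded_at: str        # ISO timestamp, e.g. "2026-04-27T06:24:53.388235"
```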
Raw LLM Response
[
{"id":"ytc_UghFzdPE96-vgXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugh85duhMW553XgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UghuQ76Mtq7bmXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UggbfSnvIR1GcXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgiMadlbSIBUj3gCoAEC","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgippfQcZ5eF2XgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ughq7T7pcmvSuHgCoAEC","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_UggYnPBXwk_QsHgCoAEC","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ughho5exH_I7x3gCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UggduMjarUQUYHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
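The raw response is a JSON array with one object per comment in the batch, each carrying an `id` key plus the four coding dimensions. A lookup like the one at the top of this page can therefore be resolved by parsing the stored response and filtering on that key. The following is a minimal sketch under that assumption; the function name and the example file path are hypothetical, not part of the pipeline.

```python
import json


def find_coding(raw_response: str, comment_id: str) -> dict | None:
    """Return the coding record for one comment from a raw batch response.

    Assumes the model output is a JSON array of objects, each with an "id"
    key plus the coding dimensions, as in the example above.
    """
    records = json.loads(raw_response)
    for record in records:
        if record.get("id") == comment_id:
            return record
    return None


# Example usage against a stored response (the file path is hypothetical):
# with open("raw_responses/batch_0001.json") as f:
#     coding = find_coding(f.read(), "ytc_Ugh85duhMW553XgCoAEC")
# For the sample above this would return the record with
# responsibility="developer", reasoning="deontological",
# policy="liability", emotion="outrage".
```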