Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
The question is why would we create robots with the power to feel in such a way that we humans would feel our very existance was in danger? Humans and animals already exist, and thus we create rights for ourselves and other organism like animals. But sentient AI doesn't exist yet. If you don't want to answer the messy question, don't create the conditions for the problem to arise in the first place.
Source: youtube · AI Moral Status · 2017-02-23T14:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
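The Coding Result panel above renders one record from the raw batch response, plus a pipeline-added timestamp. A minimal sketch of such a rendering in Python (the function name and signature are hypothetical, not part of the actual pipeline):

```python
def render_coding_result(rec: dict, coded_at: str) -> str:
    """Render one coded record as a markdown Dimension/Value table."""
    rows = [
        ("Responsibility", rec["responsibility"]),
        ("Reasoning", rec["reasoning"]),
        ("Policy", rec["policy"]),
        ("Emotion", rec["emotion"]),
        ("Coded at", coded_at),  # timestamp added at coding time, not by the LLM
    ]
    lines = ["| Dimension | Value |", "|---|---|"]
    lines += [f"| {k} | {v} |" for k, v in rows]
    return "\n".join(lines)
```

For example, passing the record with `"policy": "regulate"` and `"emotion": "fear"` reproduces the table shown above.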
Raw LLM Response
```json
[
  {"id":"ytc_Ugj-GG9HRZn1i3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghBGDS4uJuvI3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgiqMqcGlJ3kTXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UggBfEyjI40hpHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugiq0IRMB0CCh3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugh-w9BtvkjcungCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugjqh3cjpA79VXgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugj0_G0fNIn_JngCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugj2N050ddsTSHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgizsfMfud5iSXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}
]
```
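The raw response is a JSON array of per-comment codes. A minimal sketch of parsing and validating such a batch in Python — note the allowed category values below are inferred only from the records shown here and are assumptions, not the full codebook:

```python
import json

# Allowed values per dimension, inferred from the sample records above.
# Assumption: the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"none", "unclear", "ban", "regulate", "industry_self"},
    "emotion": {"indifference", "fear", "outrage", "approval", "mixed"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index valid records by comment ID.

    Records with any out-of-codebook value are dropped rather than coerced,
    so downstream counts only ever contain known categories.
    """
    coded = {}
    for rec in json.loads(raw):
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            coded[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return coded
```

An indexed batch then supports the panel's lookup-by-ID use case directly, e.g. `coded["ytc_Ugh-w9BtvkjcungCoAEC"]` returns that comment's four dimension codes.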