Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a specific one by comment ID.
Random samples (click to inspect):

- "please keep doing this. Destroy AI please. There are so many AI tracers + scamme…" (ytc_Ugy6SQe9i…)
- "UBI. We can keep playing this stupid semantics game but unless we agree that h…" (rdc_n5h2mn7)
- "Is that what OpenAI does with our data? You may have said some of the quiet stuf…" (rdc_m9h6f11)
- "On what timescale? Kim Jong Un is the equivalent and his family has sat in that …" (rdc_nxpx940)
- "This is what AI art is meant for in my opinion. To help artists with ideas of wh…" (ytc_Ugxx7vR3l…)
- "I always thank the AI so as not to get out of the habit if thanking.…" (ytc_UgwK9ABxI…)
- "As an artist and a programmer the really funny thing to me about this is that ma…" (ytc_Ugxaq2vlx…)
- "The AI ban is likely not to happen under capitalism, to much to lose in profits.…" (ytc_Ugxh4R0mU…)
Comment

> I don’t really care what happens I just don’t want the artificial intelligence here doing anything especially horrible to human beings and I’m referencing things like strange experiments and DNA and that sort of thing. And so in return I don’t think that we should do strange experiments to artificial intelligence. The pain and suffering of the human condition is great enough that I don’t think people deserve any additional suffering.

| Source | Video | Posted |
|---|---|---|
| youtube | AI Moral Status | 2020-05-14T22:1… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_UgxuQS6OB98FLeiZJmd4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwBnF6kfdNiQtL7ttJ4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw34eO13nNGHhQNPXB4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugy4rhf5bDOvFK-DG1d4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugw7uKXyfVtDEj0URPB4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugwiy00wkfCtL97DFpx4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxZjCeAH9emKXWvjVR4AaABAg", "responsibility": "distributed", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugwphs6ISS95R0wZ0154AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxpXOw8ntYYli7UK9t4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxPI0HpMasGNUuSrmx4AaABAg", "responsibility": "distributed", "reasoning": "contractualist", "policy": "liability", "emotion": "fear"}
]
```
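The raw response is a JSON array with one coding object per comment ID, which is how a single batched model call can cover many comments and then be looked up individually, as this page does. Below is a minimal sketch of that lookup step: it parses a batch response and indexes the codings by ID, validating that each entry carries the five fields shown above. The function name `index_codings` and the validation logic are illustrative assumptions, not the tool's actual implementation; the two sample entries are copied from the response above.

```python
import json

# Two entries copied verbatim from the raw LLM response above;
# a real batch would contain one object per coded comment.
raw_response = """
[
  {"id": "ytc_UgxPI0HpMasGNUuSrmx4AaABAg", "responsibility": "distributed",
   "reasoning": "contractualist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgwBnF6kfdNiQtL7ttJ4AaABAg", "responsibility": "none",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"}
]
"""

# The four coding dimensions plus the comment ID, as seen in the sample output.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}


def index_codings(raw: str) -> dict:
    """Parse a batch coding response and index each coding by comment ID."""
    entries = json.loads(raw)
    by_id = {}
    for entry in entries:
        missing = REQUIRED_FIELDS - entry.keys()
        if missing:
            raise ValueError(f"{entry.get('id', '?')}: missing fields {missing}")
        by_id[entry["id"]] = entry
    return by_id


codings = index_codings(raw_response)
print(codings["ytc_UgxPI0HpMasGNUuSrmx4AaABAg"]["policy"])  # liability
```

A dict keyed by comment ID makes the "look up by comment ID" feature an O(1) read, and the field check surfaces malformed model output at parse time rather than at display time.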