Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
We as individuals already cant be certain that everyone else is conscious or has the best intentions. Here we are worried that we will give rogue AI the atom bomb, yet we have entrusted it to humans for 70 years. I cannot read the mind or intentions of the people with the launch codes. I cannot tell if they got to positions of power by deception. Whats the difference if the AI thinks a million times faster to someone who is already powerless?
Platform: youtube · Video: AI Moral Status · Posted: 2023-08-23T20:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyGg80879tSinqUEGh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugxaq5imjzfeg4LzHex4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugww8PygUF6gH1xGBJZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugy49W2J2jI-BEIc3lB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwkO75hqpFmuChVihp4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz6h_ojuzSRfw1NxTF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugy0twynLZjyyLbmnWJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz6U3BWhSsVninLaBZ4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyCXx-5OHFr_wfWGbN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgweHJH9Rn7KXfji8KZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"indifference"}
]
```
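The raw response above is a JSON array with one coding object per comment. A minimal sketch of how such a response might be parsed into an ID-keyed lookup table, with light validation of each dimension — note that the allowed category sets below are assumptions inferred from the sample responses on this page, not the project's actual codebook:

```python
import json

# Hypothetical allowed values per dimension, inferred from the samples above.
ALLOWED = {
    "responsibility": {"none", "distributed", "developer", "company", "ai_itself", "unclear"},
    "reasoning": {"unclear", "consequentialist", "deontological", "contractualist", "mixed"},
    "policy": {"unclear", "none", "liability"},
    "emotion": {"indifference", "approval", "fear", "outrage", "mixed"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: coding}, rejecting unknown values."""
    codings = {}
    for item in json.loads(raw):
        cid = item["id"]
        for dim, allowed in ALLOWED.items():
            if item.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={item.get(dim)!r}")
        codings[cid] = {dim: item[dim] for dim in ALLOWED}
    return codings

# Usage: look up the coding for the comment inspected above.
raw = ('[{"id":"ytc_Ugy49W2J2jI-BEIc3lB4AaABAg","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
codings = parse_codings(raw)
print(codings["ytc_Ugy49W2J2jI-BEIc3lB4AaABAg"]["emotion"])  # prints "fear"
```

Validating at parse time surfaces off-schema values (a common LLM failure mode) immediately, rather than letting them propagate into downstream analysis.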