Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "oH nO It SEeems THAt i cAn'T wORk aNyMorE" ahh robot didn't wanna be the change… (ytc_UgyFowGvk…)
- Nice imagining that and hate to destroy it, but ChatGPT does not store records o… (ytc_UgylJ4myO…)
- I think an important point this video is missing is the origin of creation. Arti… (ytc_UghNAI8Iy…)
- Yes they are/have. My girlfriend works in corporate for one the largest telecomm… (ytc_Ugw45JAw7…)
- Generative AI is definitely not conscious and definitely doesn't feel anything. … (ytc_Ugx0mznNr…)
- You make gallium guns, and regulate ai to only be made out of aluminum in and ou… (ytc_UgzclRS0-…)
- omg please place these robots with pedos so that they leave children alone and a… (ytc_UgzzG84Ak…)
- Spoken like someone whose no longer on the ground level. AI is hype by the marke… (ytc_UgwYd2sIj…)
Comment
I tried this the other day and ChatGPT actually gave me a lot of interesting answers. I asked about the nephilim and what happen to them. Said that they went to war with angels and the angels used plasma weapons. And I asked if plasma weapons are used today and it says yes and it named many countries that use them.
Then , just out of curiosity , I did the prompt again to get more information. But suddenly it’s not giving me any answers. All the same questions that I initially asked are now “myth” or “unknown”. Kinda weird.
Source: youtube · Video: AI Moral Status · 2025-07-22T11:5… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyMYbdx6zgF2O9OknZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugzfv6IQLZjWdSLLlxR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx1r_mmYvPeAOD5ujd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxvQMpC2S1quZPAQjZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyIN5I1WKjkXzJOwjF4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugw9Sx9tJDm1DdW61lR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyFeqImatno2KHjmXN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugw723iriE8VggdL40l4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgzpJ5IyNJvNsUnaZQR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugzz7dwef_Ls0wAkC6h4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"}
]
```
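The raw response is a JSON array with one object per coded comment, carrying the four dimensions shown in the Coding Result table. A minimal validation sketch for ingesting such a batch; note the allowed values below are only those observed in this sample, and the real codebook may define additional categories:

```python
import json

# Category values observed in this sample batch (an assumption;
# the full codebook may permit more).
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "company"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"unclear", "none", "ban"},
    "emotion": {"approval", "mixed", "indifference", "fear", "outrage"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject rows with unknown codes."""
    rows = json.loads(raw)
    for row in rows:
        # Comment IDs in this dataset are prefixed "ytc_".
        if not str(row.get("id", "")).startswith("ytc_"):
            raise ValueError(f"bad comment id: {row.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: {dim}={row.get(dim)!r}")
    return rows

raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"unclear","policy":"unclear","emotion":"approval"}]')
coded = validate_batch(raw)
```

Rejecting unknown codes at ingest time keeps a single hallucinated label from silently entering the coded dataset.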