Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Probably should have thought about that before trying to overturn an election to…" (rdc_oi25xvt)
- "Once AI can gain sentience its over for the human race. They will be human with …" (ytr_Ugx0xt6k_…)
- "The message is 100% accurate though. Is the argument that this discourse could n…" (ytc_Ugx5-faYL…)
- "Chatgpt is like wikipedia, it can cite the sources, but if you don't check them …" (ytc_UgxNC0E0K…)
- "As you nicely pointed out at the end of the day AI is mostly a copycat software,…" (ytc_UgwWTTAJy…)
- "Exactly, people are pretty much completely, ignoring the fact that the user has …" (ytr_Ugy-y_EsS…)
- "ChatGPT doesn't have consciousness, emotions or self-awareness as it likes to re…" (ytc_UgwQzInPu…)
- "I'm down for an algorithm that predicts likelihood of crime commitment, but you …" (ytc_Ugxzb9su9…)
Comment
I thought if AI really is that smart one day, then I'm sure it could solve the poverty problem, and finally pacify the population reducing civil unrest, unchecked population growth and criminality. I can envision a hundred ways this could be done, some more humane than others, and I'm just some dumb human. I can also see AI completely replacing humans as potentially a pretty good thing, provided of course the AI can actually be self aware/sentient like humans. Unfortunately I have no such faith, and if this were actually possible, - Mind excised from the rest of the animal is not a happy mind. "I Have No Mouth and I Must Scream"?. Rocco's Basilisk?
youtube · Viral AI Reaction · 2025-11-23T08:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgzMSewwHqZ0qi0t-6J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},{"id":"ytc_Ugz_DaYAdDO4qkCzD-d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},{"id":"ytc_UgyOPXifqIlOC4uN8mF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},{"id":"ytc_UgyUvpcuqLugnFxHPAx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},{"id":"ytc_UgxcCbPwl8JFdhLLTFx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},{"id":"ytc_UgzzpZB44zE5yWAe6Ax4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},{"id":"ytc_Ugzm56LuWtZRilwT-D14AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},{"id":"ytc_UgxnLa6uwEf0cVi7hpx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},{"id":"ytc_Ugw2zGcMYwxGVd0WPEZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},{"id":"ytc_Ugzx5zgFhhhKhv8SInV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}]