Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- “If any of my ships on Character AI get leaked then that’s an equivalent to death…” (ytc_Ugyx28J--…)
- “For me AI art is just a tool to give me a clearer vision of what i wanna draw, i…” (ytc_UgwTbD_41…)
- “If humanity and AI cannot coexist in a way where human needs go unmet, humanity …” (ytc_UgxkJFvIP…)
- “Apparently, neither of these men had a niece or nephew, or intern or law clerk t…” (ytc_UgzgF25fK…)
- “Bro doesn't know a thing about AI Grok Gemini and Copilot couldn't help me fix m…” (ytc_UgydDColM…)
- “The "robot dog" in the image is a Spot the German Bundeswehr purchased, it's *ma…” (rdc_ku8cj7p)
- “lol Marketing is the only thing as sleazy as AI SLOP. Not doing as whataboutism,…” (ytc_UgwuHHAz1…)
- “We need to apply the Steve philosophy to future A.I. and create safe A.I. with l…” (ytr_UgyTTVdBY…)
Comment
- “You’re not made of meat, you’re made of electronics” what?!
- “Humans are not conscious”
- “Humans are not the most ethical creatures”
- “in 20 years robots will be able to do human jobs”
-“ Robots are the best”
The guy robot is terrifying.
I don’t care what anyone says I think robots are the most dangerous thing that has ever been created.
They already put themselves at the level of humans. Where are we going? Are they planning on replacing people with robots in job? What are we gonna do?
Source: YouTube · “AI Moral Status” · 2019-12-06T23:0… · ♥ 106
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgypcnlJCwcPYFjUgDZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyLUegXaOLgcyHUTYx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugyp0esLQH4zTNeYfg94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwKmRb3-oNR1VG5U6B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugyt91QW5r-t_5GWanJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugwuyple0aG0WwTUucx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugyog30MvdPwRVRGEPF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgytWyEYCEEZz2J-Y194AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzWDNDfoC8XbO2BdbR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxVmwfNbnbKrHVZ1yd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
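The raw response is a JSON array with one object per comment, each carrying an `id` plus the four coding dimensions. A minimal sketch of how such a response could be parsed and validated before loading it into a dataset; note the allowed label sets below are inferred only from the rows shown on this page, and the full codebook may define additional categories:

```python
import json

# Allowed labels per dimension, inferred from the sample rows above
# (assumption: the real codebook may contain more values).
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"ban", "regulate", "liability", "none"},
    "emotion": {"outrage", "fear", "indifference"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, rejecting rows with missing ids
    or labels outside the expected sets."""
    rows = json.loads(raw)
    for row in rows:
        if "id" not in row:
            raise ValueError(f"row missing comment id: {row!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim}={row.get(dim)!r}")
    return rows

# Example with a single (hypothetical) row in the same shape as above.
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"ban","emotion":"outrage"}]')
rows = parse_coding_response(raw)
```

Validating at ingest time catches the common failure mode where the model invents a label outside the codebook, so a bad row fails loudly instead of silently entering the coded dataset.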