Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
> "The question of robots demanding their own rights" is only anwserable in one logical way: Give it to them of course!
>
> No being should suffer. All are equal. We are all robots. Humans, machine, animals. You can butcher thousands of chicken, it will not make the fried chicken feel less tastfully. But be aware that you are mistreating a automaton. So stop cruel acts and make our current robots' life better. Robot rights have a natural successor, animal rights-.
youtube · AI Moral Status · 2018-12-11T16:3… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwxbGDsqCiSNuQhRfR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwVgsUMRlcixxZfw8p4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx631C12qaWO8ZV5vN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzT9okUcSlUw_n7VSd4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyipJWvs2wfjFQCHOd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz5JhTK_u6mxG2AHjB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwOauXWT3WGLwPcCXV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxYcPwVex6HZFsb2kl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzUSa2V7YjIYKavGqN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxbAV0LZJaLxgRItnl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
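The raw response is a JSON array in which each record carries a comment `id` plus the four coded dimensions shown in the table above. A minimal sketch for parsing such a batch response, validating that every record has the expected fields, and indexing the codes by comment ID (the function name `index_by_id` is illustrative, and only two records from the response above are excerpted):

```python
import json

# Excerpt of a raw batch response (two records from the array above).
raw = '''
[
  {"id":"ytc_UgwxbGDsqCiSNuQhRfR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx631C12qaWO8ZV5vN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
'''

# The four coded dimensions plus the comment ID, as seen in the response.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(raw_json: str) -> dict:
    """Parse a batch coding response and index the records by comment ID.

    Raises ValueError if any record is missing an expected dimension,
    so malformed model output fails loudly instead of silently.
    """
    records = json.loads(raw_json)
    indexed = {}
    for rec in records:
        missing = EXPECTED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} missing {missing}")
        indexed[rec["id"]] = {k: rec[k] for k in EXPECTED_KEYS if k != "id"}
    return indexed

codes = index_by_id(raw)
print(codes["ytc_Ugx631C12qaWO8ZV5vN4AaABAg"]["emotion"])  # -> outrage
```

Looking up a coded comment then becomes a dictionary access on its full ID, which matches how this page resolves a comment ID to its coding result.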