Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "Creativity is what makes us human because it is the only way our emotions can be…" (`ytc_UgxoT1nvE…`)
- "The most alarming aspect is that people debating AI and the dangers of AI laugh …" (`ytc_UgxKVh7qc…`)
- "I experienced the same thing when I tried to bring reality to the conversation a…" (`rdc_d1kve11`)
- "It can compute faster than us arguably now but what about creativity? I don’t s…" (`ytc_UgyJEsAKN…`)
- "I would expect the next Waymo evolution to be cars that are able to yell at each…" (`ytc_Ugw6pX23k…`)
- "I’ve spent 12 hrs on a single drawing, for someone to say I don’t have a right t…" (`ytc_UgwSGNXof…`)
- "I love AI... Cause i can give life to my thinkings without disregard my SoD2 pla…" (`ytc_UgwlZLNDa…`)
- "my chatgpt is labeled Gale from BG3 so I remind myself to be nice to it and that…" (`ytc_Ugxh5y7uK…`)
Comment
I don't think we're remotely close to having sentient robots and so calls to think about robot rights are very much premature and silly right now, but I think it's pretty problematic to use that to justify claims that robots as a class simply could never have rights at all. If we had a C3PO or a Johnny 5 in real life, it'd be very clear to anyone interacting with it that it's more of a person than a toaster is. An LLM certainly isn't a person in any sense and that is technology that is being used to oppress real humans so I get it, but it just seems weird to categorically state robots should not have rights. It's just weird conflating AI as it stands now with sentient robots. They're nothing alike.
Source: youtube | Posted: 2025-10-10T20:3… | ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgyHQdLBGQnbG9XrN254AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw2-U8V_1q-TWjUPq94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugzc_kbYPP3J64STzy94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzEVCLRlTMLKUUwh214AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxYcdap3hipnwL8NOB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzZya8GCYlGmIy1E-t4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugxt30yf0E4kLC9Kltp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgzPLHzLfnfp8BFo8Vx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzLoSrixqg5Oi_3bb54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyNcoeJd3YSuEbyqYl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"resignation"}
]
```
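A raw response like the one above is a JSON array of per-comment codes. As a minimal sketch, the array can be loaded and sanity-checked before use; note that the allowed value sets below are assumptions inferred only from the values visible on this page, and the real codebook may permit more:

```python
import json

# Allowed values per dimension, inferred from the sample response above
# (assumption: the actual codebook may define additional values).
ALLOWED = {
    "responsibility": {"none", "company", "ai_itself", "user", "developer"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"none", "liability", "ban", "industry_self"},
    "emotion": {"indifference", "outrage", "fear", "resignation"},
}

def parse_codes(raw):
    """Parse a raw LLM response into {comment_id: codes}, checking each value."""
    coded = {}
    for row in json.loads(raw):
        comment_id = row["id"]
        codes = {dim: row[dim] for dim in ALLOWED}
        for dim, value in codes.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{comment_id}: unexpected {dim}={value!r}")
        coded[comment_id] = codes
    return coded

raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"deontological","policy":"none","emotion":"indifference"}]')
print(parse_codes(raw)["ytc_example"]["emotion"])  # indifference
```

Keying the result by comment ID makes it easy to join the codes back onto the original comment table when computing agreement or aggregate statistics.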