Raw LLM Responses
Inspect the exact model output behind any coded comment. Look up a comment by its ID, or browse the random samples below.
| Comment (excerpt) | ID |
|---|---|
| Some say that "God made man in his image". I reason that some humans believe tha… | ytc_UgzYAXWTZ… |
| There is hope: https://youtu.be/zeabrXV8zNE?si=sGOUt7Eh1gla_oJ2 Please share th… | ytc_UgzaKV0B5… |
| 2:38 So the current justice system favors making a judgement first and cherry pi… | ytc_UgxeD7BMn… |
| ...unless if i can prove that I didn't break the law do you promise not to tell … | ytc_UgzjRCaEm… |
| Ironically, the link posted itself contains a bunch of tracking arguments ("utm… | rdc_jteqwkp |
| I see you've used sad emojis in your comment. If you have any concerns or feedba… | ytr_UgwmhBV6S… |
| Yes I agree, but in one breath he said this. In the next breath he's making AI r… | ytc_UgweovB5L… |
| anybody’s face can be taped onto anybody’s body through Photoshop and CGI editin… | ytc_UgzMWBZp-… |
Comment
I do agree that the present ethics question primarily lies with how AI impacts humans, but the idea that robots will never be deserving of rights does not sit well with me. At the rate we are moving, we are likely to have sentient robots in my lifetime. Something that can speak my language, may be more intelligent than me, has a sentient mind and can be exploited should be granted some consideration. It is good that we consider these questions now so that we know how best to respond later. Yes, humans will always be the priority in this equation, but that doesn’t necessitate excluding robots from the ethics question. “Never” is not a good word to apply to ethical queries.
Source: youtube · 2025-09-17T11:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | liability |
| Emotion | mixed |

Coded at: 2026-04-27T06:24:53.388235
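
The coded record behind this table is small enough to type out. A minimal sketch in Python, with the value examples taken from the raw response below; the class name and the use of plain strings (rather than enums) are illustrative assumptions, not the tool's actual schema:

```python
from typing import TypedDict

class CodedComment(TypedDict):
    """One coded comment, as it appears in the raw LLM response below."""
    id: str              # platform-prefixed comment ID (ytc_/ytr_/rdc_ in the samples above)
    responsibility: str  # e.g. "company", "user", "developer", "distributed", "none", "unclear"
    reasoning: str       # e.g. "deontological", "consequentialist", "mixed", "unclear"
    policy: str          # e.g. "regulate", "liability", "none", "unclear"
    emotion: str         # e.g. "outrage", "approval", "resignation", "indifference", "mixed"
```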
Raw LLM Response
```json
[
{"id":"ytc_UgxBj3x3fexkqMBB7rF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugw6xrrd4pQjuRRKUaJ4AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwPjOF4pCjxDJs9MbJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugxt0rcPiqk3meRbOU14AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxRsXTuKzpkxRReh-d4AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"disapproval"},
{"id":"ytc_Ugy7xkEgfk0Y5S_Ecbh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxKGWBEHrm3URaGCXl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugw2MkU3h-SWESpz40d4AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxQDL_rogUePHjHnc94AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugxf3PGO1q2CBhyQ5GB4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"mixed"}
]
```
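
Because the raw response is a flat JSON array keyed by comment ID, the look-up-by-ID view above needs only a parse and an index. A minimal sketch, assuming the response has been saved to a file; the filename raw_response.json and the helper name index_by_id are hypothetical:

```python
import json

def index_by_id(raw_response: str) -> dict[str, dict]:
    """Parse a raw LLM response (a JSON array of coded comments)
    and index the records by comment ID."""
    records = json.loads(raw_response)
    return {record["id"]: record for record in records}

# Usage: fetch the coding for one comment by its full ID.
with open("raw_response.json", encoding="utf-8") as f:
    coded = index_by_id(f.read())

row = coded["ytc_Ugxf3PGO1q2CBhyQ5GB4AaABAg"]
print(row["responsibility"], row["reasoning"], row["policy"], row["emotion"])
# -> distributed deontological liability mixed
```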