Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID, or inspect one of the random samples below:

- "Compassion and fairness... I love all sentient beings, that is why I have a hea…" (ytc_UgwFUpJAI…)
- "Alacnay The Great Well technically no.... but Zenyetta is in the video and she's…" (ytr_Ugii75IQR…)
- "Shelby - just Compare WAYMO $250,000 cars with STOCK Tesla model Y. let that si…" (ytc_Ugx2enphX…)
- "We are not summoning the demons but he surely does to create an amazing marketin…" (ytc_Ugy9yBdMm…)
- "As someone who has studied the philosophy on consciousness between a great numbe…" (ytc_UgwGNrh2q…)
- "This assumes the algorithm is infallible and incorruptible. Humans are fallible …" (rdc_mto5yl3)
- "I was all in on this guy untill he started spewing woke dogshit like uwu putin b…" (ytc_Ugy3KoKFT…)
- "Why do I even have to acknowledge morality? I was under the impression any kind …" (rdc_di3r4u7)
Comment
The idea that robots will eventually get rights doubles as a way of dehumanizing Black people: equating actual machines with the historical struggles of people who were treated as machines and believed to be less than human--and still are, by many of the same people promoting these concepts. It's also crazy that people can more easily imagine robot rights than equal rights among humans.
A robot can't be a "slave" or deserve "rights" any more than a shoe or a saucepan can.
Source: youtube · 2025-09-17T11:1… · ♥ 368
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgxWVKMHttR5ZT595pd4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwDeolbZsztKSHGOQx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwvs2p2UPEuZCX98fN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzhYzfqqTaUqrcVej54AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwrDwkjqBvRr528k7R4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxDyWZfACN4QWlE2j14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy55HHIdO4VW6KJtBd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz5V759UR2eSKt_pMp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzCVBnuGoIp5T0OUFN4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugxi4BlP3Pebc2EiN1l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
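The raw response above is a JSON array, one object per coded comment, keyed by comment ID with one value per coding dimension. A minimal sketch of how such a batch could be parsed and validated is below; the allowed-value sets are assumptions inferred only from the values visible in this sample, and the full codebook may contain more categories:

```python
import json

# Assumed value sets, inferred from the sample output above (the real
# codebook may define additional categories).
ALLOWED = {
    "responsibility": {"user", "developer", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "approval", "indifference"},
}

def parse_coded_batch(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of coded comments)
    into a dict keyed by comment ID, checking every dimension value."""
    coded = {}
    for row in json.loads(raw):
        cid = row.pop("id")
        for dim, value in row.items():
            if dim not in ALLOWED or value not in ALLOWED[dim]:
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        coded[cid] = row
    return coded

# One row from the raw response shown above:
raw = ('[{"id":"ytc_UgwrDwkjqBvRr528k7R4AaABAg",'
       '"responsibility":"distributed","reasoning":"deontological",'
       '"policy":"regulate","emotion":"outrage"}]')
batch = parse_coded_batch(raw)
print(batch["ytc_UgwrDwkjqBvRr528k7R4AaABAg"]["policy"])  # → regulate
```

Validating against a fixed value set at parse time catches the occasional off-codebook label an LLM emits before it silently enters the dataset.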