Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Hope this slows Shadiversity down... he has mentioned in the past that he steals…" (`ytc_Ugw_-DtWi…`)
- "With the global population shrinking, I think AI will be a solution to our skill…" (`ytc_UgywTtMzB…`)
- "Or number three, people learn to use automation to be better at their jobs. That…" (`ytc_UgwVomsJS…`)
- "I'm sure the FDA will approve this for American use in like 10 fucking years.…" (`rdc_fjzhuzt`)
- "Democrats with AI, what's to worry about that (as I shake in my boots). There ha…" (`ytc_UgzdQDQaz…`)
- "I started a conversation with a furry ai and now we're in a loop of calling each…" (`ytc_UgyH0re5p…`)
- "This is from Detroit: become human , a video game. Not a real AI. She is played …" (`ytc_UgzwEihjm…`)
- "Your argument is both empathetic, logical, and articulate- thank you for talking…" (`ytc_UgwK3zzWr…`)
Comment
There are already morons who are fighting to give rights for "sex robots".
Robots will never have rights. Rights (laws, norms, morals) are historical social constructions of different human societies. More we differ, more different rights show up for each civilization, group, cluster, nation etc.
Robots will have to evolve to such levels, where they find common social goals for their better own existence. That is what differs us - humans, from them. In the end, robot will always be nothing more than machine, tools of humans.
Platform: youtube
Video: AI Moral Status
Posted: 2017-03-29T18:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UghrtkIaEYufGXgCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UghCvNhEHN-AcngCoAEC","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgiOBS6RkHXMSHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgjwY2J2WgKWg3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UggUC5VN_TTCq3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz6otA0YsK1H_oU8AB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgyIkqhIIjdH2ymktNx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugz_WZU3jCe3MYPLU4B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxAkrhfdggp5M7Mml14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw1Tu2KEOx1r2EjJj54AaABAg","responsibility":"government","reasoning":"contractualist","policy":"liability","emotion":"mixed"}
]
```
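The raw response above is a JSON array with one record per comment ID and one value per coding dimension. A minimal sketch of how such a batch could be parsed and validated, assuming the field names shown in the response; the allowed value sets below are inferred only from the values visible in this sample, so the actual codebook may permit more:

```python
import json

# Value sets observed in the sample batch above (assumption: the real
# codebook may define additional values for each dimension).
SCHEMA = {
    "responsibility": {"none", "ai_itself", "developer", "government", "unclear"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"indifference", "outrage", "approval", "fear", "mixed"},
}

def parse_batch(raw: str) -> list:
    """Parse one raw LLM response (a JSON array) and validate each record."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError("record missing comment id")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim} value {rec.get(dim)!r}")
    return records

# Hypothetical single-record batch for illustration:
raw = '[{"id":"ytc_example","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"}]'
batch = parse_batch(raw)
print(batch[0]["emotion"])  # -> outrage
```

Validating against a fixed value set catches the most common failure mode of LLM coders, namely inventing labels outside the codebook, before bad records reach the results table.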