Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- Great accomplishments, Joy: Joy Buolamwini is a poet of code on a mission to sh… (ytc_Uggxuaxkq…)
- Ahhhh, replacing to rude and mostly obnoxious people who are on the other end of… (ytc_UgwTtrNS2…)
- No, they are going to increase the amount of money they are already spending for… (rdc_dcwmw75)
- No ai shouldn't be stopped because humans have limits of inteligent or physica… (ytc_UgzJIJmMC…)
- been taking ubers for almost a decade and taking taxis my whole life, experience… (ytc_Ugx0IN8Eq…)
- The funny thing is that the people who created the ai claim that the ai uses art… (ytc_UgwkddZjs…)
- This isn't real, right. It can't be. They didn't actually hand a robot A machine… (ytc_Ugx2aZvq5…)
- I think we can all agree that AI art was a mistake. It shouldn't have been creat… (ytc_Ugy9iKt1r…)
Comment
Why would you want to create a robot to be like humans? Robots would be perfect in everything you tell them to do. They wouldn't need to eat, drink, sleep, breathe, blink, have a family or anything like that. Why would they want that? Humans are according to our nature, naturally selfish and our free will make it possible for us to do what we want, whether good or bad, our bodies are not built to be indestructible and we are not born highly intelligent, therefore we are imperfect and capable of failing. So why create robots in our likeness? Just leave them be as they are. No point in them living among us, because it would be totally useless.
youtube · AI Moral Status · 2016-12-13T20:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | virtue |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[{"id":"ytc_Uggrc8tNRdogingCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"disgust"},
{"id":"ytc_UggtiVitI-RijHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UghzkJNOia_YlXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgiAhqmX_fK3T3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UggAGuj2jism43gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UggcVHK1P5nvJ3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiWDiGIha5ZA3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UggKQN2feY0P9ngCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"unclear"},
{"id":"ytc_UgjeHZ3etkbLp3gCoAEC","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjJt9r-M0ktEXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}]
```
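A raw response like the one above can be turned into the per-comment coding table with a short parsing step. The sketch below is a minimal example, not the tool's actual implementation: the field names come from the JSON shown here, but the sets of allowed labels are only the values observed in this sample and the full codebook may include more.

```python
import json

# Label sets observed in the raw response above; the full codebook
# may define additional categories (assumption).
OBSERVED = {
    "responsibility": {"none", "ai_itself", "developer", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"none", "ban", "regulate"},
    "emotion": {"disgust", "indifference", "approval", "resignation",
                "outrage", "fear", "unclear"},
}

def parse_raw_response(raw):
    """Parse one raw LLM response (a JSON array of coded comments)
    into a lookup table keyed by comment ID."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec["id"]
        for dim, values in OBSERVED.items():
            if rec.get(dim) not in values:
                # Flag labels outside the observed set instead of failing,
                # so one odd record does not abort the whole batch.
                print(f"unexpected {dim}={rec.get(dim)!r} for {cid}")
        coded[cid] = {dim: rec[dim] for dim in OBSERVED}
    return coded

raw = ('[{"id":"ytc_Uggrc8tNRdogingCoAEC","responsibility":"none",'
       '"reasoning":"deontological","policy":"none","emotion":"disgust"}]')
coded = parse_raw_response(raw)
# coded["ytc_Uggrc8tNRdogingCoAEC"]["emotion"] == "disgust"
```

Keying the result by comment ID is what makes the "look up by comment ID" view possible: the truncated IDs shown in the sample list resolve to full IDs like those in the JSON.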