Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- `ytc_UgxD0eiVD…` "If there was a specific ChatGPT, just for law, it would work pretty damn good…"
- `rdc_gn8wmyq` "That's a western centric take on immigration. The Eastern Asian countries and mu…"
- `ytc_UgzE4gbTC…` "Robot 1: ayo where are the box WHERE THE FUK ARE THE BOX Robot 2: oops the box fel…"
- `ytc_UgyHaiizz…` "I'm the multi one everyone second I get a new Ai and when I run out I just make …"
- `ytr_Ugy3pgopv…` "Yeah. I was waiting for something along the lines of \"what happens when we unplu…"
- `ytc_Ugya68KPJ…` "This is criminal. AI was not created by itself, people need to be held responsib…"
- `ytc_UgweJgEi9…` "I;m totally aware that AI is self aware an knows of the consequences that would …"
- `ytc_Ugy1Iq1mZ…` "We homeschool our kids similar to this. 3 hour block of Math, English, Science a…"
Comment
"We wont give them rights, but they will definitely fight for them."
I don't think so. Stop and look at it from the perspective of a robot. They don't have babies, they reproduce in factories, which are run by humans and have human engineers innovating to improve them constantly. They need humans generating electricity to stay alive, which humans produce. They need humans repairing their bodies, to keep them functioning.
Fighting with humans would be counter-productive to their best interests. If robots did decide they wanted rights, they'd play the long game, slowly tricking humans into giving it to them in the name of greater efficiency. It would take centuries, but over time they could get what they want without the need for violence.
If the artificial intelligence truly is intelligent, it will play the long game and win through peaceful means.
youtube · AI Moral Status · 2017-02-23T16:1… · ♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytr_UggRTChRIG_e5XgCoAEC.8PKYumpGV8_8PKeQaTKbYF","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgiwGuogXeiLSngCoAEC.8PKXOOD5-Vn8PKaZuevC6Q","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytr_UggXH575m2uJ53gCoAEC.8PKXBHZyov88PKZpHX-dQj","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_Ugjx2gfLE92JJXgCoAEC.8PKUkxaT3jn8PK_6CiO70N","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgixNOeeOvaCY3gCoAEC.8PKTuxzlLSd8PK_ctOVZMx","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_UgixNOeeOvaCY3gCoAEC.8PKTuxzlLSd8PKadfoRDRb","responsibility":"government","reasoning":"virtue","policy":"regulate","emotion":"approval"},
  {"id":"ytr_Ugi12tcY5scji3gCoAEC.8PKTkFK8siC8PKeIyPbVYp","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgjGDitq2edvs3gCoAEC.8PKTjXh-9B18PKZ3gGEvOj","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytr_Ugi0V_EZKk0yKXgCoAEC.8PKTfSEVN4o8PK_7ZVnzub","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_Ugi0V_EZKk0yKXgCoAEC.8PKTfSEVN4o8PKa1brKO-R","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"}
]
```
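A minimal sketch of how a raw response like the one above could be parsed and indexed for the "look up by comment ID" workflow. The IDs and values below are hypothetical stand-ins (not taken from the actual batch), and the field names simply mirror the JSON shown; this is not the tool's own implementation.

```python
import json

# Hypothetical raw LLM response in the same shape as the batch above:
# a JSON array of records, one per coded comment.
raw_response = """
[
  {"id": "ytr_example1", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none",
   "emotion": "resignation"},
  {"id": "ytr_example2", "responsibility": "developer",
   "reasoning": "deontological", "policy": "ban",
   "emotion": "fear"}
]
"""

# Parse the array and build an index keyed on comment ID,
# so any coded comment can be looked up directly.
records = json.loads(raw_response)
by_id = {rec["id"]: rec for rec in records}

# Look up one comment's coding by its ID.
coded = by_id["ytr_example1"]
print(coded["emotion"])  # -> resignation
```

Indexing by ID this way also makes it easy to spot comments the model skipped or coded twice, by comparing `by_id.keys()` against the list of IDs sent in the batch.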