Raw LLM Responses
Inspect the exact model output for any coded comment. Enter a comment ID to look it up directly, or pick one of the random samples below.
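A minimal sketch of such a lookup, assuming the codings have been flattened into a JSON array of objects with an `id` field, as in the raw response at the bottom of this page. The file name `coded_comments.json` and the function itself are hypothetical, not the tool's actual code:

```python
import json

def lookup_comment(comment_id: str, path: str = "coded_comments.json") -> dict | None:
    """Return the coded record for one comment ID, or None if absent.

    Assumes the file holds a JSON array of objects, each with an "id" key
    (hypothetical storage layout inferred from the raw responses below).
    """
    with open(path, encoding="utf-8") as f:
        records = json.load(f)
    return next((r for r in records if r.get("id") == comment_id), None)
```

For example, `lookup_comment("ytr_Ughcmty2iMMsFHgCoAEC.8PL3q3krKEk8PL7m7PJYg9")` would return the developer/deontological/unclear/fear entry visible in the raw response at the bottom of this page.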
Random samples
- `rdc_nl2hq2v`: "I see it smart as a backup like hey you forgot a screw. But also yeah fucking di…"
- `ytr_UgyvR3crJ…`: "Even if something someone makes isn’t original, it’s far from an excuse for anyo…"
- `ytr_UgzG7NXtw…`: "Also, let's not forget that references are the building blocks of any project. I…"
- `ytc_UgzBdaFG1…`: "Got an ai commercial when I clicked on this sam with a skibidi toilet in gotcha …"
- `ytc_UgwazMWN4…`: "You sure don't seem to be aware at all of them. Self driving cars have clocked h…"
- `ytr_UgytMgcaF…`: "I feel the same way. It's soul crushing. I can't imagine a day I haven't at leas…"
- `ytc_UgwNCc_i9…`: "WELL??? IF HISTORY_ as we know it is one big lie.!!!???? And HISTORY is just his…"
- `ytc_Ugzu9XVRc…`: "AI is a valid tool wich arguably is not that different to any tool replacing an …"
Comment

> But what if its programming tells it to program other robots in the best way it can? Most code for machines today are actually written by other programs which are written by programmers. If the program tells it to program the machine for the machine to work for as long as it can, what the program sees it necessary to program pain and pleasure so that the robot would perform for as long as it could and sense its environment to steer itself away from harm? Then it would experience pain and pleasure. Would it be concious?
> It does what its programming tells it to. What if we're not the ones programming it?

Source: youtube
Topic: AI Moral Status
Posted: 2017-02-23T21:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
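The Dimension/Value table above is presumably rendered from a single entry of the raw batch response below, joined with the coding timestamp. A minimal sketch of that rendering step; the function name and signature are assumptions, not the tool's actual code:

```python
def to_markdown(entry: dict, coded_at: str) -> str:
    """Render one coding entry as the Dimension/Value table shown above.

    `entry` is one object from the raw LLM response (hypothetical join
    on the comment ID, which this page does not show).
    """
    rows = [
        ("Responsibility", entry["responsibility"]),
        ("Reasoning", entry["reasoning"]),
        ("Policy", entry["policy"]),
        ("Emotion", entry["emotion"]),
        ("Coded at", coded_at),
    ]
    lines = ["| Dimension | Value |", "|---|---|"]
    lines += [f"| {label} | {value} |" for label, value in rows]
    return "\n".join(lines)
```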
Raw LLM Response
```json
[
{"id":"ytr_UgiNKzzbbtFetXgCoAEC.8PL4_BkYKhH8PL7-06PJyx","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgiNKzzbbtFetXgCoAEC.8PL4_BkYKhH8PLC4xIffA-","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ughcmty2iMMsFHgCoAEC.8PL3q3krKEk8PL7m7PJYg9","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytr_UggRPiq5dwY9P3gCoAEC.8PL2mhsalPI8PLABX7lpAr","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytr_UgiOaXexrKY_SXgCoAEC.8PL12S-v8fO8PL5FFgDJRd","responsibility":"ai_itself","reasoning":"contractualist","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugjrf2Y85YKyVngCoAEC.8PL-Q9tvEeN8PL4sqa1Ty2","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugjrf2Y85YKyVngCoAEC.8PL-Q9tvEeN8PL7cIsRhFh","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugjrf2Y85YKyVngCoAEC.8PL-Q9tvEeN8PL92CZfAg3","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgjIvUEfE5r063gCoAEC.8PKz6lrvdh48PLB9sOe8Te","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytr_UggWa3AcKd7cHXgCoAEC.8PKyWZiGd_R8PL60F186iM","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
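Each raw response is a JSON array with one object per comment in the batch. A minimal sketch of parsing and sanity-checking such a response; the allowed label sets are inferred only from values visible on this page, so the real codebook almost certainly has more categories:

```python
import json

# Label sets per dimension, inferred solely from values visible on this
# page (assumption: the actual codebook likely defines more categories).
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "unclear"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "unclear"},
    "policy": {"none", "unclear"},
    "emotion": {"fear", "approval", "indifference"},
}

def parse_batch(raw: str) -> dict[str, dict]:
    """Parse one raw LLM response (a JSON array of per-comment codings)
    into an id-indexed dict, skipping entries with out-of-schema labels."""
    coded = {}
    for entry in json.loads(raw):
        comment_id = entry.get("id")
        if not comment_id:
            continue
        if all(entry.get(dim) in labels for dim, labels in ALLOWED.items()):
            coded[comment_id] = entry
    return coded
```

Skipping invalid entries is just the simplest behavior to sketch; a real pipeline might instead queue out-of-schema codings for re-coding.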