Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click an entry to inspect)

- ytr_UgxjN69ql… · "if you truly are a fan of ghibli and you respect Miyazaki's work, you would also…"
- ytc_UgyD_ttgr… · "It was clear from the beginning that this would happen one day. People are stupi…"
- ytc_Ugwqsce58… · "ok, you proved it. stop being cruel to the AI, my dude. a human would tell you t…"
- ytc_UgwuA9QBf… · "Ai artists say it's a tool, it can be creative, and all they do is directly copy…"
- ytc_UgyHi2EB1… · "If I had a button to turn off the Internet and ai I would. We are literally run…"
- rdc_nw85w73 · "I've seen a lot of very naive Redditors who are under the impression that AI wil…"
- ytc_Ugyc14pbJ… · "The whole reason as to the narrative implementing AI to to the world to recreate…"
- ytc_UgzSa743X… · "This video really changed my perspective! With OSVue, I see AI as a tool to elev…"
Comment
If and when we create AI (assuming that we can communicate with it) we could ask it what it thinks. Everyone saying no or yes are trying to make the decision for somthing that is perfectly capable of making that decision for itself not that it would necessarily be bothered by it at all. If it wants protection from violenece or whatever it would say so. People making trying to defend their side is like when somebody makes an offensive joke and everybody gets offended apart from the dude that the joke was supposed to have offended. In conclusion: If we do make some AI pretty sure it will tell us for itseld the way it wants to be.
youtube · AI Moral Status · 2017-02-24T22:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | contractualist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
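The coded dimensions in the table above can be carried as a small record type. A minimal Python sketch, with field names taken from the table; the example value sets in the comments are drawn only from codings visible on this page, and the class name is illustrative:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CodingResult:
    """One coded comment, mirroring the dimensions in the table above."""
    comment_id: str
    responsibility: str  # values seen here: "ai_itself", "user", "none"
    reasoning: str       # "contractualist", "consequentialist", "deontological", "unclear"
    policy: str          # "none", "regulate"
    emotion: str         # "approval", "fear", "indifference"
    coded_at: str        # ISO 8601 timestamp

# The coding shown in the table above:
result = CodingResult(
    comment_id="ytc_UggFU3s3bpetwXgCoAEC",
    responsibility="ai_itself",
    reasoning="contractualist",
    policy="none",
    emotion="approval",
    coded_at="2026-04-27T06:26:44.938723",
)
```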
Raw LLM Response
```json
[
  {"id":"ytc_Ugi7kG8Ji4CkN3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgjWtK98dVOiO3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgipEs5BcXU2Z3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggkudIeHsDg73gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UggFU3s3bpetwXgCoAEC","responsibility":"ai_itself","reasoning":"contractualist","policy":"none","emotion":"approval"},
  {"id":"ytc_UggW2mHw9QpLJ3gCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgijNLd-v6PQO3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_Ugj1m65ckfcSAHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghtCdi-rbmhM3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghB59eFQ0-173gCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"indifference"}
]
```
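The raw response is a JSON array of per-comment codings, so looking a comment up by ID is one parse plus one dictionary build. A minimal sketch, assuming only the array shape shown above (the helper name is illustrative, and the two rows used here are copied from the batch):

```python
import json

def index_codings(raw: str) -> dict:
    """Parse a raw LLM batch response and index the coding rows by comment ID."""
    rows = json.loads(raw)
    return {row["id"]: row for row in rows}

# Two rows copied verbatim from the raw response above:
raw = '''[
  {"id":"ytc_UggFU3s3bpetwXgCoAEC","responsibility":"ai_itself","reasoning":"contractualist","policy":"none","emotion":"approval"},
  {"id":"ytc_UggW2mHw9QpLJ3gCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]'''

codings = index_codings(raw)
print(codings["ytc_UggFU3s3bpetwXgCoAEC"]["emotion"])  # approval
```

This mirrors the "Look up by comment ID" affordance above: the `id` field in each row is the comment ID shown next to each sample.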