Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
I dont agree with deepfakes.
But were getting to the point where ... OMG!. Some…
ytc_UgzNg7mNy…
We have lived in a post-scarcity society for *decades* now.
The only thing that…
rdc_fcswha5
I am a teacher of nine years. I now seek out AI advice for speeding up resource-…
ytc_UgxVLI9yH…
Most AI is the same shit as ask jeeves from 20+ years ago. Data regurgitators w…
rdc_ofhtt68
If a company developed the whole algorithm and all of the sample data themselves…
rdc_jwvfhrk
Have artists thought about pursuing this legally? Especially when there is proof…
ytc_UgzPG7l3p…
Clever AI Humanizer (100% free) is very helpful and highly recommended, the resu…
ytc_Ugyf0dImQ…
I imagine alot of the AI stuff would be closed systems which would make it prett…
rdc_ohsy5x4
Comment
Isaac Asimov's three rules of robotics should be a good base for programming robots:
First Law: A robot may not injure a human being or, through inaction, allow a human being to come to harm.
Second Law: A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
Third Law: A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
But, if and when, robots become self-aware and demand rights.. I fear that Animatrix is going to be a prophet for the future humankind. 🤔
Every creature, that is self-aware, should have rights.
youtube
AI Moral Status
2021-12-23T14:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | industry_self |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz1gKUCEMWwCARjHox4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwYHC1hH8ZXTJg-FDB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy5aYamGjLJGN7npVp4AaABAg","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxWLQNzzbrUeVItFwl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxpCvZx8XsSV4GHt854AaABAg","responsibility":"none","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzV-9b1bUCzsgeg7AR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgyxTnewRgSJmaseNAV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugww_b_teocVEo2uuHB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxB1xJbGcmPqASVsi94AaABAg","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxEjH_zbRBLuZiRR_h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
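The raw response above is a JSON array with one object per comment, keyed by comment ID and carrying the four coding dimensions from the table (responsibility, reasoning, policy, emotion). A minimal sketch of the "Look up by comment ID" step might parse that array and index it by ID; the function name, the fallback value `"unclear"`, and the second sample ID are assumptions for illustration, not part of the tool.

```python
import json

# Hypothetical sketch: parse a raw LLM batch response and index it by
# comment ID. The first ID and its values come from the response above;
# "rdc_example123" is an invented placeholder row.
RAW_RESPONSE = """
[
  {"id": "ytc_UgzV-9b1bUCzsgeg7AR4AaABAg",
   "responsibility": "developer", "reasoning": "deontological",
   "policy": "industry_self", "emotion": "indifference"},
  {"id": "rdc_example123",
   "responsibility": "none", "reasoning": "mixed",
   "policy": "none", "emotion": "indifference"}
]
"""

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse the JSON array and key each coding by its comment ID,
    defaulting any missing dimension to "unclear" (an assumed fallback)."""
    coded = {}
    for row in json.loads(raw):
        coded[row["id"]] = {dim: row.get(dim, "unclear") for dim in DIMENSIONS}
    return coded

codings = index_codings(RAW_RESPONSE)
print(codings["ytc_UgzV-9b1bUCzsgeg7AR4AaABAg"]["policy"])  # industry_self
```

Indexing by ID keeps the lookup O(1) per comment, which matters when a batch response covers many sampled comments at once.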