Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a comment by its ID, or pick one of the random samples below to inspect it (a minimal lookup sketch follows the sample list).

Random samples
| Comment ID | Preview |
|---|---|
| ytc_UgzwHAtWY… | Itll be about time before someone charges based on AI usage and then unskilled w… |
| ytc_UgxdvrBll… | Linus Torvalds once said "AI is 90% marketing and 10% Reality". He's def not wro… |
| rdc_hsf2lqi | OP wasn't hired to write a script, they were hired for certain results, which we… |
| ytc_Ugw0lJrYT… | When is AI gonna start making more jobs then it’s deleting. I’ve only seen compa… |
| rdc_e2wga9h | What a surprise. The steaming pile of shit asshole president is making the for… |
| ytc_UgxAO0lRe… | Absolutely hilarious how he used the analogy of listening to the BBC and reading… |
| ytc_UgyBZ7Mvl… | Guys to any AI artist that is using this argument.. We are definitely NOT born … |
| ytc_Ugy9GeAyf… | The extremely niche job example is my literal job. You say the AI would dent car… |
Comment
A no point should a construct have HUMAN rights. If and when they are PROGRAMMED to have some measure of consciousness, it is merely a set of instructions in their programming telling them to acknowledge the presence of something, that is identified by their program as being present.
There is no actual "self awareness" in the machine, there is only a programmed instruction for the device to follow and it is still no more sentient than a brick.
The false parallel(s) to enslaving robots is just that, a false argument or strawman. A robot, even with a program to appear as "self awareness", is no more TRULY sentient than a hammer... And I guess we're going to have to give hammers human rights if a robot gets them...
youtube · AI Moral Status · 2017-02-23T14:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
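For reference, a coding result like the one above can be carried around as a small typed record. The sketch below is illustrative only: the allowed label sets are inferred from the values visible on this page (including the raw response further down) and may not be the codebook's complete vocabulary.

```python
from dataclasses import dataclass

# Label sets observed on this page; the real codebook may define additional values.
RESPONSIBILITY = {"none", "ai_itself", "developer", "distributed", "unclear"}
REASONING = {"deontological", "consequentialist", "unclear"}
POLICY = {"none", "ban", "regulate", "industry_self", "unclear"}
EMOTION = {"indifference", "fear", "outrage", "approval", "mixed"}

@dataclass
class CodedComment:
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def validate(self) -> list[str]:
        """Return a list of human-readable problems; an empty list means the record is clean."""
        problems = []
        for field, allowed in (
            ("responsibility", RESPONSIBILITY),
            ("reasoning", REASONING),
            ("policy", POLICY),
            ("emotion", EMOTION),
        ):
            value = getattr(self, field)
            if value not in allowed:
                problems.append(f"{field}: unexpected label {value!r}")
        return problems
```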
Raw LLM Response

The model codes comments in batches; the batch below contains ten codings, and the first entry is the one shown in the Coding Result table above.
```json
[
{"id":"ytc_Ugj-GG9HRZn1i3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UghBGDS4uJuvI3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgiqMqcGlJ3kTXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UggBfEyjI40hpHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugiq0IRMB0CCh3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugh-w9BtvkjcungCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugjqh3cjpA79VXgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugj0_G0fNIn_JngCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugj2N050ddsTSHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgizsfMfud5iSXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}
]
```
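A raw batch response like the one above can be parsed back into per-comment rows. A minimal sketch, assuming the response text is available as a well-formed JSON string (the variable name `raw_response` is a placeholder):

```python
import json

def parse_batch_response(raw_response: str) -> dict[str, dict]:
    """Parse a batch coding response into a mapping of comment ID -> coded dimensions."""
    coded = {}
    for record in json.loads(raw_response):
        # Missing fields fall back to "unclear" rather than raising.
        coded[record["id"]] = {
            "responsibility": record.get("responsibility", "unclear"),
            "reasoning": record.get("reasoning", "unclear"),
            "policy": record.get("policy", "unclear"),
            "emotion": record.get("emotion", "unclear"),
        }
    return coded

# Example with the first entry of the batch shown above.
raw_response = '[{"id":"ytc_Ugj-GG9HRZn1i3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}]'
print(parse_batch_response(raw_response)["ytc_Ugj-GG9HRZn1i3gCoAEC"])
```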