Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I am honestly less worried about that.. I personally think what it will come down to is communication.. people dont consider animals anything special because there is a huge difference in comprehension and communication.. they seem on the surface to lack it aside from basic grunts screams roars and other core emotional responses.. it also doesnt help that many middle eastern faiths tend to say we are higher then animals and they are ours too do with what we will basically. which these faiths are the dominant ones. this is also the reason there would be an uproar to any sentient robot.. however most would be accepting of them if they were able to communicate... this is the same reason you can connect with an alien on a scifi show.. they communicate on our level or higher.. we respect them as equals or more because they have proven to be that.
however! the biggest problem is the distrust.. machines wont think like us unless hey are programmed too.. and even then whats to stop reprogramming.. what happens if a software glitch happens.. will it be like terminator, irobot, or that one movie that came out in 2015.. I forgot its name but the robot wasnt evil.. however it wasnt good thus human life might not have any meaning too it.. the big issue with robot rights is not seeing it as another being that is worth the rights.. but rather is it s being we can trust to be apart of our society or will it aim to destroy ius or even control us because it sees us as inefficient, broken, counter intuitive, and hypocritical.. we would be the complete opposite of a machines logic.. what would be black and white too it would be grey and take us much longer to decide the stance.. or better yet we would never agree.. look at how many things in our history could of been from people like tesla that never made it because of concepts like money.. A concept a machine might deem worthless.. it would see a job and say it needs to be done so we need the resources.. which means we need to get them however possible.. whether it be space mining or what have you.. but this complete difference in thought process is the scary part and the plot to most movies that have the machines take over or attempt too..
distrust is the problem at least with social acceptance.. not as much it being sentient and considered another being. now getting into the government that is a whole other issue and probably the reason machines would kill us all.. or try to rule us.. lol.
Platform: youtube · Video: AI Moral Status · Posted: 2017-02-24T18:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | contractualist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugixaj93h0Q5xXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghRdXH0RQOp8HgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugh-SjVAq9zdx3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UggZCcJUMZuFnXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugg4gAkvZdg7z3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugj9wGJPXC_hu3gCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UggIZ1W19SNryngCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgicOMwNotsRh3gCoAEC","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghBXLlsrBabe3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UggXJxSl99YodXgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
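A raw response like the one above is a JSON array of per-comment records keyed by comment ID. The sketch below shows one way to parse such a batch and index it by ID, skipping records whose dimension values fall outside the expected set. This is a minimal illustration, not the tool's actual code: the `SCHEMA` sets are inferred only from the values visible in this sample (the real codebook may define more categories), and the function name `validate_batch` is hypothetical.

```python
import json

# Allowed values per coding dimension. NOTE: these sets are inferred from
# the values visible in this one sample response; the actual codebook may
# include additional categories.
SCHEMA = {
    "responsibility": {"developer", "none", "ai_itself", "distributed"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "unclear"},
    "policy": {"none", "unclear", "liability"},
    "emotion": {"indifference", "mixed", "fear", "approval", "resignation", "outrage"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index valid records by comment ID.

    Records missing an ID or containing an unrecognized dimension value
    are dropped, so one malformed row does not poison the whole batch.
    """
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            continue
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            coded[cid] = {dim: rec[dim] for dim in SCHEMA}
    return coded

# Usage with a two-record batch (one valid, one with a bad value):
raw = (
    '[{"id":"ytc_x","responsibility":"none","reasoning":"contractualist",'
    '"policy":"none","emotion":"indifference"},'
    '{"id":"ytc_y","responsibility":"martians","reasoning":"unclear",'
    '"policy":"none","emotion":"mixed"}]'
)
coded = validate_batch(raw)
print(sorted(coded))                  # ['ytc_x']  (invalid record dropped)
print(coded["ytc_x"]["reasoning"])    # contractualist
```

Dropping (rather than raising on) invalid rows is a deliberate choice here: with LLM-generated JSON, a single hallucinated label is common, and salvaging the rest of the batch is usually preferable to failing it wholesale.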