Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking it up by comment ID or by browsing random samples.

Random samples
- ytc_Ugx63hY67…: "I love your art btw I have been getting into art and found trends that people ha…"
- ytc_UgxgcE43M…: "I use AI when I do a search and want a more in depth answer, and then asking cop…"
- ytc_UgwdGmmyr…: "For AI there is literally no effort whatever involved. For the human, the amount…"
- ytc_UgwThoMkT…: "School has became easy for kids with AI tech and all this other tech they litter…"
- ytc_UgxP_KnoC…: "After someone working at tesla got his back almost ripped out by one of their ro…"
- ytc_Ugy91xmVQ…: ""Digital art is just like AI" Yeah, sure. And downloading a premade model and p…"
- ytc_UgwsHBx1l…: "Worst thing is he can’t describe AI much better than “umm, it’s like a brain but…"
- ytr_UgyrzXf25…: "@davidrink1291 but in comparison to AI-powered we human being even don't have a…"
Comment
This assumes it gets to the point of legitimate consciousness and/or autonomy.
The reason it causes issues with humans is because we are conscious creatures. Not showing empathy to a mailbox after it got hit by a car never hurt anyone. But not being careful that there aren't any jagged pieces of metal on the mailbox after you fix it up definitely has. If AI does become conscious, or convincingly imitates consciousness, then that would become a problem. However, expecting empathy is a human characteristic that an AI might not need. An "AI" language model doesn't have neurochemicals to make it happy or sad or angry. An AI wouldn't be able to feel upset over its place in the world unless we simulated human feelings in the program.
I guess all that is still far in the future though. (Hopefully)
youtube · AI Moral Status · 2025-05-24T08:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytr_UgyQfHtmGnw9kL0sbsp4AaABAg.AJD-vAKVA4QAK9VKOEiqK3","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgzukWbOyPucVjtXflF4AaABAg.AIsHOhdrxkCAIuhUOKrKDK","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgzukWbOyPucVjtXflF4AaABAg.AIsHOhdrxkCALHm3MZo0a6","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugy3anUeUrp_s4BhaC14AaABAg.AITl2pKBnX5AIVOD_vZrgS","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgxxOagt1Ac8e_16kkJ4AaABAg.AIMGvoiATsdAIVPnx7KVa0","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgzSb3hJasAKZuIRV354AaABAg.AIJq_0Kg-2CAIVQHEVQ3K3","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgwucDHFz5TMSdF_Vhp4AaABAg.AFplo8jx11AAFq9HiRrfpd","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgzGaRixydj-yPm3W2t4AaABAg.AFc7wU956kcAFoUzqCGFqx","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytr_UgwLZpRcgfvKUtEaEp54AaABAg.AFc58pCec5BAFc6atY-xTe","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugx4k244ZcBlF1kJ9fp4AaABAg.ADpKrXtIKVGADqW9NNhZeT","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
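A minimal sketch of how such a raw response could be parsed and checked before it is shown in the table above. The set of allowed values per dimension is an assumption inferred only from the examples on this page, not an authoritative schema, and `parse_coded_batch` is a hypothetical helper name.

```python
import json

# Allowed values per coding dimension. ASSUMED: inferred from the sample
# records shown on this page; the real codebook may define more values.
SCHEMA = {
    "responsibility": {"none", "company", "developer", "distributed", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"unclear", "liability"},
    "emotion": {"indifference", "fear", "mixed"},
}


def parse_coded_batch(raw: str) -> dict[str, dict]:
    """Parse a raw LLM response (a JSON array of coded comments),
    validate each record against SCHEMA, and key the result by comment ID
    so a single comment can be looked up."""
    out = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad value {rec.get(dim)!r} for {dim}")
        # Keep only the coding dimensions, dropping any extra keys.
        out[cid] = {dim: rec[dim] for dim in SCHEMA}
    return out


# Usage with a made-up record in the same shape as the response above.
raw = ('[{"id":"ytr_example","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
coded = parse_coded_batch(raw)
print(coded["ytr_example"]["emotion"])  # fear
```

Keying the parsed batch by ID makes the "look up by comment ID" view a plain dictionary access, and the validation step surfaces any record where the model drifted outside the codebook instead of silently displaying it.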