Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- “I’m gonna be the astronauts stranded outside the space craft in 2001 wondering w…” — ytc_Ugzb89-oF…
- “what until they make SUPER robots. Wait, robots. Wars. Super robots that fight w…” — rdc_ff258fo
- “I don't see it happening anytime soon you need truck drivers in the seat these b…” — ytc_Ugz48na8n…
- “@Lemi-l9q im not pro AI but I am sick of hearing Liberals cry about AI all the t…” — ytr_UgyMt0Phg…
- “Companies don’t care if the code quality isn’t good; they just care if it’s good…” — ytc_Ugze1kgvI…
- “someone out there thought "hey an airplane can maintain a heading and an altitud…” — ytc_UgziK_7vE…
- “Unrelated to the AI conversation, but all of your work is so cute and fun. I lov…” — ytc_UgzOEnkk3…
- “Draw Venn Diagrams of What the Humans can do vs what the AI can do, Overlaid by …” — ytc_Ugylz_nX-…
Comment
Echonian I agree with your final assessment. If something can demonstrate itself to be equal to us, then we will afford it equal privileges.
Other than that, treatment of lower beings is a reflection on the people and person doing it, not the thing being acted upon. To a certain degree, there is a difference, for example, you might knock a spider into a toilet and flush it down, but you wouldn't throw a dog into a sewer pipe full of water, but as long as your method of disposal minimizes suffering and you aren't causing it to suffer for your own enjoyment, there is nothing wrong with terminating a non-sentient being for any reason.
The concept of AI is far more complicated. If AI has rights because it's self aware, then debugging a new AI is repeatedly killing it. I'm not sure a human created intelligence could ever be considered to have human rights. Unless you considered debugging the same as human pregnancy, then you could abort the unwanted AI as often as you wanted, regardless of its wishes.
Platform: youtube · Video: AI Moral Status · Posted: 2017-02-23T22:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytr_UgjMKCdFaGnkGngCoAEC.8PL9n70d-7-8PLGR-Gcmqk","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytr_UggeuSOns12B93gCoAEC.8PL9IJ8LLBO8PLGRtxpSqm","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgiwChM3Eqxrg3gCoAEC.8PL8vI-p5HA8PLC6qS69_D","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgiwChM3Eqxrg3gCoAEC.8PL8vI-p5HA8PLIzvGT-Xf","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytr_UgiwChM3Eqxrg3gCoAEC.8PL8vI-p5HA8PLa_FRr-BJ","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytr_UghEk2ewSQ_ybXgCoAEC.8PL7wj4xfkA8PLBgInP7un","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytr_UghEk2ewSQ_ybXgCoAEC.8PL7wj4xfkA8PLCU-Orko7","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytr_UghMInwGG2smj3gCoAEC.8PL4dG3w0rG8PL6b5GC_fJ","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_UghMInwGG2smj3gCoAEC.8PL4dG3w0rG8PLBVtLYA5H","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UghMInwGG2smj3gCoAEC.8PL4dG3w0rG8PLCxSLN3k3","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"}
]
```
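The raw response is a JSON array, one object per coded comment, carrying the four dimensions shown in the Coding Result table. A minimal sketch of how such a response could be parsed and sanity-checked follows; the `ALLOWED` vocabularies are inferred from the sample output above (the real codebook may permit additional values), and the comment IDs in `raw` are hypothetical placeholders, not real IDs from this dataset.

```python
import json

# Hypothetical raw LLM response in the same shape as the dump above.
# The "ytr_example*" IDs are placeholders for illustration only.
raw = """[
  {"id": "ytr_example1", "responsibility": "user",
   "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytr_example2", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "indifference"}
]"""

# Per-dimension vocabularies observed in the sample output; treat this as
# a best-effort check, since the full codebook may allow more values.
ALLOWED = {
    "responsibility": {"none", "user", "developer", "ai_itself"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"none", "ban"},
    "emotion": {"approval", "outrage", "indifference", "resignation", "fear", "mixed"},
}

def parse_codings(text):
    """Parse one raw LLM response, keeping only well-formed rows."""
    valid = []
    for row in json.loads(text):
        if "id" not in row:
            continue  # every coding must be linked to a comment ID
        if all(row.get(dim) in vocab for dim, vocab in ALLOWED.items()):
            valid.append(row)
    return valid

codings = parse_codings(raw)
print(len(codings))  # → 2
```

Dropping malformed rows instead of raising keeps one bad model output from discarding an entire batch of codings.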