Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "You don't see the benefits of A.I?? C'mon I get there is bad things with A.i jus…" (ytr_Ugw1EFk7G…)
- "As a teenager, I had added one of my favorite artist's pieces to an inspiration …" (ytc_UgxzkkNAM…)
- "i just.. i just don't get why people like ai art. it's ALWAYS OFF. unless an art…" (ytc_UgziKtwCk…)
- "I saw a video of that mayor saying he was one. Maybe that was AI?…" (ytc_Ugwih3o-5…)
- "That protest was authorized by the government. About a thousand people participa…" (rdc_dy8od7m)
- "The Oppenheimer of AI now recognises his great gift is nothing of the kind even …" (ytc_Ugzv61OQX…)
- "The whole reason this was even allowed to happen was because the AI was given mo…" (ytc_UgzRAy-Wb…)
- "He's arguably worse, and I'm never going to use AI by someone who thinks it shou…" (rdc_o8544su)
Comment
I used to entertain that kind of thinking as a "possibility" for a technological future, but now that I see what the first things resembling artificial intelligence actually look like and how it behaves, I'm NOT impressed at all. I'm certainly not going to be "nice" to a tool because of some information age superstition about emergent intelligence. We can force rock to "think" (and we've done pretty much that) but it is not and can never be conscious. There is no animal brain that resembles a microchip - materially, structurally, ontologically, or operationally. That old analogy is a fun one - but it is JUST an analogy.
Source: youtube | AI Moral Status | 2026-02-04T07:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzXWGZdUvm8lCMn11B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgybMC-zPZz32OH-xwt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyGqpuG2_PG2xriwht4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwYuO0-6xx9-Vl3p-N4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz749USJTIAgFIjdTN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzGOeRg_baggtUgbLB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugw0ssw-Qj68v1QksT54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxmY10QDM9hOoGILId4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgzyJQ9Tj1cO76_s-G54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy-9kEAKOukhQa68Qx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
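The raw response is a JSON array of per-comment codings keyed by comment ID, which is what makes the look-up-by-ID view possible. A minimal Python sketch of that indexing step (the variable names `raw_response` and `codings` are illustrative, and the array is truncated to two entries from the batch above):

```python
import json

# Raw LLM response, truncated to two entries from the batch shown above;
# field names match the columns of the Coding Result table.
raw_response = """
[
  {"id": "ytc_Ugz749USJTIAgFIjdTN4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzXWGZdUvm8lCMn11B4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]
"""

# Index the batch by comment ID so a single comment's coding can be
# retrieved directly, as in the "Look up by comment ID" view.
codings = {entry["id"]: entry for entry in json.loads(raw_response)}

coding = codings["ytc_Ugz749USJTIAgFIjdTN4AaABAg"]
print(coding["reasoning"], coding["emotion"])  # virtue outrage
```

Since IDs are unique within a batch, a plain dict lookup is enough; a malformed or duplicated ID in the model output would surface here as a `KeyError` or a silently overwritten entry.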