# Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a response by comment ID, or browse the random samples below.

## Random samples
- `ytc_UgyeG-8HX…`: "It's funny because whatever gave the initial work to us through AI is getting ex…"
- `ytc_Ugycci8hu…`: "A recursively self-improving system is still limited by its reliance on humans t…"
- `ytr_Ugz3cRoOO…`: "@hughobrien4436 prices for food aren’t going down. If you want a house to live i…"
- `ytc_UgwIsqP1X…`: "That's why I don't accept the world accepting robots side by side with humans, b…"
- `ytr_UgwwYdKbu…`: "@polycrystallinecandy humans also need freedom, I am 100% certain that the gove…"
- `rdc_gli01dr`: "we shouldn't penalise automation, we should tax all rich people more to pay for …"
- `ytc_UgyeZSO5B…`: "The old man is right AI does not know what it's doing I'm trying to get it to he…"
- `ytc_Ugzue-1sI…`: "As someone who used AI, I didn't realize it was literally taking. I didn't reall…"
## Comment
The madness of creating a machine that can credibally imitate a human being to all appearances is the ultimate deception. Hereafter there is nothing special about being human as an imitation would do just as well, perhaps better. It is not so much that a robot might decide for the fate of a human, which is bad enough, but that a human might valuate another human as less than that of a robot. Anyone who thinks robots will not decide the fate of humans is not paying attention because it is happening now at the highest levels of finance, governance, and social policy. This parlor trick bozo is just one of the many clowns who have been living in their little tech bubble for far too long.
Source: youtube · AI Moral Status · 2019-11-15T22:5…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
## Raw LLM Response
```json
[
{"id":"ytc_Ugy_dxmmncmuvzWeE6N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy1xonTsVGU5oiHe8p4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxx_wJ6FtiN7JIyksl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxArRWSM0DODO8qyjh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz6_YnHzUI3Ymn_AUZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwlmfTJIEiR-4gjN3h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyLx0fduOHFRywsRTd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwoVZ79amJM6FsmEpp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzdKERlBuIvnA9zfel4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyoYO7XrjQTZyK2Mgl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
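The raw model response is a JSON array of coding objects, one per comment, keyed by comment ID. As a minimal sketch, looking up the coding for a single comment can be done by parsing the array and scanning for the matching `id`. Here `code_for` is a hypothetical helper (not part of the tool shown above), and only two rows from the response are reproduced for brevity:

```python
import json

# Two rows from the raw LLM response shown above (truncated for the example).
raw_response = """[
  {"id": "ytc_Ugy_dxmmncmuvzWeE6N4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugy1xonTsVGU5oiHe8p4AaABAg", "responsibility": "distributed",
   "reasoning": "deontological", "policy": "unclear", "emotion": "fear"}
]"""

def code_for(comment_id: str, raw: str):
    """Return the coding dict for a comment ID, or None if it is absent."""
    return next((row for row in json.loads(raw) if row["id"] == comment_id), None)

coding = code_for("ytc_Ugy1xonTsVGU5oiHe8p4AaABAg", raw_response)
print(coding["emotion"])  # fear
```

In a real batch, the same scan works unchanged over the full ten-row array; a dict comprehension (`{row["id"]: row for row in ...}`) would be preferable if many lookups are made against one response.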