Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Cavemen painting mammoths on cave walls and toddlers scribbling with crayons are…
ytc_UgwBrlR3z…
@Potatotutrial My point is that some areas of medicine are less likely to be neg…
ytr_UgzX4_vim…
I think it's completely sensible what he says, that AI development needs regulat…
ytr_Ugy2h6jlS…
What are the chances this website results in legislation to restrict the use of …
rdc_mzmq7lt
Yuo are rigth , thecnkcally Llm are Stocatic parrots that make masive plagiarims…
ytc_UgzPKHWEb…
Shortage of real life deep,serious,meaningful conversations with youngsters lead…
ytc_Ugx0Gk5fR…
Imagine how more dependent people will be if more AI development happens. If hav…
ytc_UgzAYGAnL…
That’s why GxP AI validation is risk-based and the stringency of validation is b…
ytr_Ugx2w8ZZQ…
Comment
Humanity is a self destructive parasite with a god complex. The "genuises" that get the itch on their brain they cant scratch are to he the ones to build the end of humanity. They know they won't be here forever, most genius has less empathy, they'll jave a product of their genius, and since they'll be gone, they don't care what the consequences afterward are.
What happens when people don't have to do anything for themselves? What happens after a generation or two of humans being AI dependent? Humans just don't know when to stop, and are selfish. Poor pairing of traits with all the others.
youtube
AI Moral Status
2025-04-28T18:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgzNotkM78ASHjz4ND54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_Ugy6CuFtoB8pbuz24lF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugyovjxje96Q6DwuxaN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxrEa9aU8EwT7wsvs14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxF2I6jSWZK6Ar6DpN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwKOcU9pIkAXIGZjhJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgxlgVz8I3DEVkggwJV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxe2xCQjdJUWC3RLL94AaABAg","responsibility":"government","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugzgl5huLGa3gOgZnHx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyeMT9xhg0ahJMkX5R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
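The raw response above is a JSON array of per-comment records keyed by `id`, with the four coding dimensions from the table (responsibility, reasoning, policy, emotion) as fields. A minimal sketch of indexing such a batch and looking up one coded comment, assuming the records follow exactly the shape shown (the helper name `index_codings` is illustrative, not part of any tool shown here):

```python
import json

# A subset of the raw batch response shown on this page.
raw_response = """
[
  {"id": "ytc_Ugyovjxje96Q6DwuxaN4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgxrEa9aU8EwT7wsvs14AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"}
]
"""

def index_codings(payload: str) -> dict:
    """Parse a batch coding response and key each record by comment ID."""
    return {record["id"]: record for record in json.loads(payload)}

codings = index_codings(raw_response)

# Look up the comment selected above and read off its coded dimensions.
record = codings["ytc_Ugyovjxje96Q6DwuxaN4AaABAg"]
print(record["responsibility"], record["reasoning"])  # developer virtue
```

Keying by `id` makes the "Look up by comment ID" view above a single dictionary access rather than a scan over the array.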