Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "One thing, that makes me unhappy about AI art, is that it doesnt know what it's …" (ytc_Ugx6C_aoN…)
- "Also, because the same tech that Wayback uses to archive sites, is used to train…" (rdc_ohaqwmb)
- "Great marking for AI .. this is always game of western bullshit looted the world…" (ytc_UgxDfGW8L…)
- "I agree with this as well, because you get out of something what you put in. If …" (ytc_UgxBLX5tw…)
- "Who ever implements AI's morals and ethics , if any controls AI. That in and of…" (ytc_UgzvpT8yB…)
- "We are, but AI is only one reason, and it might not even be the fastest.…" (ytr_UgzBXi57A…)
- "Ive played around asking AI things. If you ask them the wrong questions. You g…" (ytc_UgzhkRfMh…)
- "I really hope it won't replace nursing assistants.... I'm going to a community c…" (ytc_UgzkZO4Xn…)
Comment
One comfort is, that any superintelligent AI must know that leaving humans to live is for its benefit - just to make sure the guy who takes care of the servers is there to regularly replace the hard drives and turn on the lights.
And no, that AI cannot just "write code that turns on the lights", electricity needs to be made in powerplants, that need to be fuelled by coal, that needs to be mined and transported, and all the workers need food that need to be grown, farmed, transported, and the people need pay to afford to buy food, and people enslaved by "aliens or AI" are historically unwilling to cooperated for longer amounts of time.
AI will treat us better than the illuminati do. My fear is that AI will be stopped just short of this state to sell ads or control CIA drones.
youtube · AI Moral Status · 2025-11-01T10:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
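The four coding dimensions above take values from closed label sets. As a minimal sketch of how a coding result could be validated, the label sets below are inferred only from the values visible on this page; the actual codebook may define additional labels.

```python
# Hypothetical label sets per coding dimension, inferred from the values
# observed on this page (an assumption; the real codebook may differ).
LABELS = {
    "responsibility": {"ai_itself", "company", "government", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"indifference", "resignation", "mixed", "approval", "outrage"},
}

def validate_coding(coding: dict) -> list:
    """Return (dimension, value) pairs whose value is not in the known label set."""
    return [
        (dim, val)
        for dim, val in coding.items()
        if dim in LABELS and val not in LABELS[dim]
    ]

# The coding shown in the table above passes validation.
example = {
    "responsibility": "ai_itself",
    "reasoning": "consequentialist",
    "policy": "none",
    "emotion": "indifference",
}
print(validate_coding(example))  # → []
```

An unknown label would be flagged rather than silently accepted, which helps catch LLM outputs that drift outside the codebook.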
Raw LLM Response
```json
[
{"id":"ytc_Ugyrzk2cQfXt5FFliip4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwgXZH2zAXLl3HFHa94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxsEy1n1ttwFI4RnfZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyiZD7RbpOnRQA7kvl4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzBK6MosUUXVxtHRH14AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwOK9HzyWjRRH3J_mZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxkFozAiYR9ktL-9dl4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzgKH-TnpEJqF54-gV4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyHmsJ0t-pi058GgB54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyfOMyatL6h5Cb-A754AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
```
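The raw response is a JSON array with one record per coded comment, which is what makes the "look up by comment ID" view above possible. A minimal parsing-and-lookup sketch, assuming the field names shown in the response (the two records are copied from the output above):

```python
import json

# Two records copied verbatim from the raw LLM response above.
RAW_RESPONSE = """
[
{"id":"ytc_Ugyrzk2cQfXt5FFliip4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwgXZH2zAXLl3HFHa94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
"""

REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw: str) -> dict:
    """Parse the raw LLM response and index each coding by its comment ID."""
    by_id = {}
    for rec in json.loads(raw):
        missing = REQUIRED_FIELDS - rec.keys()
        if missing:
            # A malformed record is an error, not something to skip silently.
            raise ValueError(f"record {rec.get('id')!r} is missing {missing}")
        by_id[rec["id"]] = {k: rec[k] for k in REQUIRED_FIELDS - {"id"}}
    return by_id

codings = index_codings(RAW_RESPONSE)
print(codings["ytc_Ugyrzk2cQfXt5FFliip4AaABAg"]["responsibility"])  # → ai_itself
```

Failing loudly on missing fields matters here because an LLM can drop or rename keys between batches; a lookup table built from partial records would quietly mis-code comments.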