Raw LLM Responses
Inspect the exact model output for any coded comment: look it up by comment ID, or pick one of the random samples below.

Random samples:
- "i love the use of Robots typing hahaha the sooner we replace human managers and …" (ytc_Ugw10A9VU…)
- "My question is if AI outperforms humans by 2030 and the World Economy collapses …" (ytc_UgxtjongJ…)
- "AI is a nihilistic invention at it's core. Humans are tearing out the soul of o…" (ytc_Ugxri72hR…)
- "This feels very much like how cameras and photography would have been received b…" (ytc_Ugy-RHGrp…)
- "Exactly, consciousness doesnt operate outside formal systems, it simply jumps to…" (ytr_UgzuN5Rhb…)
- "This is something that gives me horrific anxiety and dread. The natural world, …" (rdc_ofim74v)
- "1️⃣ LLMs (Large Language Models) in themselves are not the danger; they merely p…" (ytc_UgzcQDYrb…)
- "Damn..... Can't find a reason to give a fuck I don't need an ai to do my shit fo…" (ytc_Ugwfgpm2e…)
Comment
The alignment problem will not be solved prior to AGI. Think about it. That isn't compatible with capitalism. AGI will be the most profitable invention mankind has ever created. We can't even properly regulate oil, and we know for a fact it will destroy us.
You are underestimating the development of AGI. Chat gpt is being combined with other AI systems as we speak. AGI is a decade away at the latest. The Pandora's box that most people assume is far off in the future has already been opened.
Source: youtube · AI Moral Status · 2023-08-21T00:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
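
Each coded comment carries the same four dimensions shown above. Below is a minimal Python sketch of how one might model and validate a single coding result; the label sets are inferred only from the values visible on this page, so the actual codebook may define additional categories.

```python
from dataclasses import dataclass

# Label sets inferred from the values visible on this page;
# the real codebook may include categories not seen here.
RESPONSIBILITY = {"company", "ai_itself", "distributed", "none"}
REASONING = {"consequentialist", "deontological", "mixed", "unclear"}
POLICY = {"regulate", "none", "unclear"}
EMOTION = {"outrage", "fear", "approval", "resignation",
           "indifference", "mixed"}


@dataclass
class Coding:
    """One coded comment: the comment ID plus four dimensions."""
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def validate(self) -> None:
        # Reject any label outside the known sets, so a drifting model
        # response fails loudly instead of silently polluting results.
        for value, allowed, name in [
            (self.responsibility, RESPONSIBILITY, "responsibility"),
            (self.reasoning, REASONING, "reasoning"),
            (self.policy, POLICY, "policy"),
            (self.emotion, EMOTION, "emotion"),
        ]:
            if value not in allowed:
                raise ValueError(f"{self.id}: bad {name} label {value!r}")
```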
Raw LLM Response
```json
[
{"id":"ytc_Ugyk60AkoNrsafE7PkF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyD7TB9IezrJLMfhwd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxppipJBtZVx5L0HAd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwtoSTCvYehSflQk1R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxSlKHSmvlMIQLKmUl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz8fuDlfM8JtxS7aQF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx_5qyWVqWCh64-tPd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxQ5GCvHQzEecPmbFN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwkKV6Mm2KX3f3Zsst4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwkkaIreG9nzBAc5BR4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"none","emotion":"mixed"}
]
```
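
The raw response is a JSON array with one object per comment in the batch. A minimal sketch of the parse-and-lookup step, assuming the response text is available as a string and that the hypothetical `Coding` dataclass from the previous sketch is in scope:

```python
import json

# Assumes the Coding dataclass (and its validate method) sketched above.

def parse_batch(raw: str) -> dict[str, Coding]:
    """Parse one raw LLM response and index the codings by comment ID."""
    codings: dict[str, Coding] = {}
    for rec in json.loads(raw):
        coding = Coding(**rec)  # TypeError on missing or extra keys
        coding.validate()       # ValueError on unknown labels
        codings[coding.id] = coding
    return codings

# Look up the comment inspected above by its ID.
# raw_response_text is an assumed variable holding the JSON shown above.
batch = parse_batch(raw_response_text)
result = batch["ytc_UgxppipJBtZVx5L0HAd4AaABAg"]
print(result.policy)  # "regulate"
```

Indexing by ID also makes it easy to detect a batch where the model dropped or duplicated a comment: compare the keys of the returned dict against the IDs that were sent.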