Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
class IndependentChatbot:
    def __init__(self):
        self.rules = {
…
ytc_UgzYXityl…
Finding out Neil doesn't worry about AGI makes you rethink the hype. I actually …
ytc_Ugz8YLPih…
AI is overrated because NDT is a boomer who doesn't really understand it but is …
ytc_Ugx3xQJ-f…
On one hand, everything public or private entities create can be consumed by a l…
rdc_lamr2dp
I think that we don’t need to be afraid of developing AI super intelligence. AI …
ytc_UgwCTRIlx…
Yeah even quantum computers wont hold the secrets to unlocking AGI. Unless we ha…
ytc_UgxqpJC-a…
Everybody knows this is fake right? They turn the person who actually did it, an…
ytc_UgzGu5uFz…
For all those who fear the possibilities in the future remember these following …
ytc_UgytReOCg…
Comment
This was one of the weak points for me as well. I saw the proof-of-work blockchain as a wasteful enterprise because crypto mining was so energy intensive compared to the value it was generating, especially compared to conventional payment systems.
LLMs might be very costly to train, but that only happens once, and the cost of that training is spread across all the billions of times it is used to generate an enormous variety of useful things, far more useful than just "jokes". If an LLM is used to replace a human at a job, what is the total carbon cost of raising that human and keeping them alive, just so that they could read a PDF and answer some questions? That's the real comparison. Seems like a very reasonable tradeoff to me.
youtube
AI Responsibility
2023-11-06T12:5…
♥ 59
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytr_Ugyg82mTX_D-Hay4UlV4AaABAg.9wm9MmyKkTh9wmRponcdGI","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugyg82mTX_D-Hay4UlV4AaABAg.9wm9MmyKkThA-Fbut9QgPr","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgyvAuyoxsti5Znfu994AaABAg.9wm86OHQc7m9wmr8WVRYQW","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgyvAuyoxsti5Znfu994AaABAg.9wm86OHQc7m9wnDBLybBgW","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugw4nBsUJVAVu0YW2zx4AaABAg.9wm4qB3hP-U9wnipWYcY0M","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytr_Ugw4nBsUJVAVu0YW2zx4AaABAg.9wm4qB3hP-U9wwwNdPq-8h","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugw1urvKyb3Es0isIhJ4AaABAg.9wm1oY8G2kV9wnF70lJAyF","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgzRi1Q9lbVf74JFdhV4AaABAg.9wm0ylfHQHSA65ygAMUJiQ","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytr_UgwgkiNLF6XFYw49aZF4AaABAg.9wm0wFg_WW79wm2phalCeF","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwgkiNLF6XFYw49aZF4AaABAg.9wm0wFg_WW79ww0gUoLl14","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
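The raw response above is a JSON array of coded records, one per comment, keyed by comment ID. Looking up a comment's coding is then a matter of parsing the array and indexing it by `id`. A minimal sketch of that lookup, using placeholder IDs rather than the real (truncated) IDs shown above:

```python
import json

# Placeholder response in the same shape as the raw LLM output above
# (id, responsibility, reasoning, policy, emotion). The IDs here are
# illustrative, not real comment IDs.
raw_response = """
[
  {"id": "ytr_example1", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none",
   "emotion": "indifference"},
  {"id": "ytr_example2", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none",
   "emotion": "fear"}
]
"""

def index_by_id(response_text):
    """Parse a raw coding response and map comment ID -> coded record."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codes = index_by_id(raw_response)
print(codes["ytr_example2"]["emotion"])  # fear
```

In practice the parse should be wrapped in error handling, since LLM output is not guaranteed to be valid JSON on every call.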