Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- `ytc_UgzP8YpZ-…`: In a series called pantheon they create something called a UI which is uploaded …
- `ytc_UgzeKcB9a…`: I agree with you guys, we should submit the trumpstein files to AI and let it de…
- `ytr_UgxXlbU4G…`: You still need to code AI to get it started. Also someone has to supply the data…
- `rdc_k0am6h0`: House: Biden Impeachment and shutting down government / Senate: AI regulation / Th…
- `ytc_UgzqbMluy…`: he says finds a job that AI can't do ....what is there that AI cant or wont be a…
- `ytc_UgwolURPN…`: A normal human is going to make mistake a lot until they understand the purpose …
- `ytc_UgzsQ0Xop…`: So let me get this straight. You are not an artist if you use Ai. You dont draw.…
- `ytc_UgwQmbWVh…`: I'll believe when i see it. Not to mention, what idiot allows a system to rewrit…
Comment
This year marked 2 decades of my career in software development. I was reading about AI/AGI/ASI from the middle of 2000 onward. I read A New Kind of Science by Wolfram when it was released in 2002. I was present on overcomingbias, then when lesswrong was created. I read and participated in some of all those conversations about AI risk. I was AI pdoom=100% long before 2020. I know all the arguments,... If you asked me in 2020, if AI will kill us all, I wouldn't hesitate to say "yes, 100%."
We were playing with genetic algorithms (wiki with the same name if you don't know) from ~2000 on old Pentium IIIs.
That said, in the last 2 years, I started to have a feeling that there is something wrong with the paperclip maximizer argument. Just a slight change in my own thinking. I don't have the IQ to present a counterargument, because I know all (OK, most) counter-counter argument from Yud. I was hoping Wolfram would present it. And I have the same feeling he has that there is some limit, boundary, that is a part of our base reality, that even ASI can't overcome. I'm not saying AGI can't wipe us out with viruses and such. I'm saying I think there is something that is "base reality" that prevents "paperclip maximizers." And I'll repeat again, because I know most don't actually grasp what Yud is saying. Yud is saying that mere optimization of any goal, however benign, can wipe us all. That AGI/ASI is so fundamentally different, it's 100% it happens. His logic / logic steps are solid. But I really hope Wolfram will think about it and present an explanation that doesn't fall into the trap "I just can't imagine X." When he replied with the alien engine analogy, this is a kind of thing I expected from him. Thinking differently, from an angle that does not come from "human values."
So for me personally, this was a fascinating conversation.
youtube · AI Governance · 2024-11-13T17:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwfYHnRIec_UjaORrV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgycnzNreGpB3a7a5Hp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzd-ma0ujZAb5HhHFp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzsZtPkhMQCcCOmHgB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxYn9JXLlg20G_a09d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz2_DwgYk7tALNnvm54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugwad4p8PY-nWvnjzPN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugx0w3H6RV1sNvUp1ZV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyR6_fTp_kjrcdO_SV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwxlrHOJKfspbgJ1TZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```