Saturday, April 15, 2023

Should We Build a Better God?

    God comes to you in a dream and says, "Hey, next Tuesday I will cease to exist. On Wednesday, you are to design my replacement. You will choose the New God's moral principles and decide how active the New God will be in the lives of future humans. On Thursday, the New God will take over and you will have no say in whatever happens next." Do you take up the offer? Do you tell him, "Actually, you should make this all democratic, and the public should vote on each aspect of the New God"? Do you say, "Actually, I think Sam Altman would be better than me at designing a New God, you should ask him"? This is essentially the dilemma we have with ASI.

    Before we get into choosing values, let's briefly discuss an even harder problem: ensuring that the New God actually follows through on our intentions. We have to be very careful what we wish for. In the short story "The Monkey's Paw," a man is granted three wishes. He first wishes for £200, and the next day his son dies in a work accident and the family is compensated £200 by the son's company. Some folks at MIRI think that "figuring out how to aim AI at all is harder than figuring out where to aim it," and I'm actually inclined to agree. Both are insanely hard, but trying to incorporate any sort of value system into machine code seems near impossible. That will be the most important technical aspect of alignment research, but let's get back to choosing values. Frankly, choosing values is actually tractable, and more fun to talk about.

    Now, who should choose? Do we want the vote on the New God's moral beliefs to be held only across the United States? Should it be worldwide, where the populations of China and India dominate the vote? Should citizens of authoritarian regimes get a vote? Should members of the Taliban? I honestly don't see how this differs much from the Constitutional Convention. We should probably have something similar for ASI: a conference among nations where we decide how humanity will design the value system of a future ASI. Some of the solutions from the Constitutional Convention will probably be applied. Maybe some votes are based on pure population and some are granted to specific countries or regions, similar to how the US has a House of Representatives (number of representatives based on each state's population) and a Senate (two senators per state). Frankly, this doesn't seem too different from what would be necessary for the formation of a world government.

    A world government is a simple solution to curbing existential risk. It's harder to have a nuclear war if there's only one country, and it's easier to collaborate on worldwide decisions if there is only one government. Assuming this government is largely democratic, it is probably the only feasible way to account for humanity's aggregated moral principles and future desires. There are obviously huge risks to a world government (authoritarianism, value lock-in), but it is very possible that one will be established in the future. If ASI is developed, it's going to pretty much take the role that a world government would anyways, as it will be insanely powerful and an individual human will have essentially no sway over anything. A world government and ASI face the same Democracy vs. Educated Leaders trade-off. There are two options when building a better God:

1. Make this process totally democratic, so that each individual currently on Earth gets a say.

2. Let a small team of experts decide the future of humanity.

    Maybe this small team is better than the rest of the world at picking between trade-offs. Maybe they are more educated, more moral, and better at determining the needs of the future humans who have yet to be born. Or maybe they are authoritarian and massively incentivized to achieve god-status and immortality for themselves. Regardless, I actually do think we should establish a New Constitutional Convention. Call it the Building a Better God convention. Maybe a majority of the population will oppose the creation of a New God altogether, and in that case we will have our answer.
