Here’s yet another piece about how some tech gurus are afraid of artificial intelligences:
Here’s the problem with this kind of thinking: it assumes technologies happen in a vacuum. This is the sort of thinking you get when somebody decides that technologies just appear all on their own, without rhyme or reason for existing. Somehow stuff happens without any input from people or market needs. This is the sort of thing you see in movies, usually with the evil corporation doing things just because they feel like it.
The reality is that that is not the case. A technology is developed to fill a need. Steam engines, for instance, happened because water needed to be pumped out of mines, and mine owners were willing to pay big money for somebody to come along with a machine that could do that job. So an AI would need somebody willing to pay for it, and like any new technology the first one is going to be hugely expensive. Anything that expensive is going to be dedicated to a job and probably not going to be used for anything else. So we can’t expect cheap AIs for a long time.
Then there’s the other side of the equation. The tools and materials needed to create an AI don’t exist, and with current technologies may never exist. Right now we are reaching the end of CMOS-on-silicon chip technology. Moore’s law is going to come to a screeching halt, probably sooner than anybody expects, and the reasons for this are economic. Silicon chip manufacture is based on a machine called a stepper. A stepper acts like a projector: it focuses light through a mask to expose a photoresist layer on the silicon wafer, and the exposed pattern is then developed and etched to form the transistors and other components on the chip. Here’s a video of the process:
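The hard physical limit behind all of this is optics. As a rough sketch (my addition, not from the post), the smallest feature a stepper can print is governed by the Rayleigh criterion, roughly k1 × wavelength / numerical aperture. The k1, wavelength, and NA values below are illustrative round numbers, not the specs of any particular machine:

```python
# Rough sketch of the optical resolution limit that drives stepper design.
# Rayleigh criterion: minimum printable half-pitch ~ k1 * wavelength / NA.
# All numbers here are illustrative, not any specific machine's specs.

def min_feature_nm(wavelength_nm, numerical_aperture, k1=0.4):
    """Approximate smallest printable half-pitch, in nanometers."""
    return k1 * wavelength_nm / numerical_aperture

# Deep-UV (ArF laser, 193 nm) vs EUV (13.5 nm) light sources:
duv = min_feature_nm(193, 1.35)   # immersion DUV, NA ~ 1.35
euv = min_feature_nm(13.5, 0.33)  # EUV, NA ~ 0.33
print(f"DUV limit ~ {duv:.1f} nm, EUV limit ~ {euv:.1f} nm")
```

The point of the arithmetic: shrinking features means shorter wavelengths or bigger lenses, and both get exponentially harder and more expensive, which is the economic wall described below.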
The stepper was first created by Perkin-Elmer, right down the road from where I live, and the corporate descendant of the original Perkin-Elmer lithography business, ASML, is still nearby. In fact I interviewed with them a while back and got a close-up view of the technology. Here’s a playlist of the sort of machines we are talking about:
These machines are getting ever more expensive and complicated, to the point where the chip foundry industry has more than likely already reached the stage where even significant investments will not bring the same level of returns. On the other side of the equation is the question of how many people will pay for the additional computing power. We’ve already reached the point where the supercomputing performance everybody was talking about in the 1980s is available on most people’s desktops. There’s also the fact that, other than phones, very few new killer apps requiring high-end hardware have come along in the last decade or so. So the pressure on the chip industry is more likely to be toward getting chips of the same resolution out cheaper, rather than constantly pushing for higher and higher resolution. Moore’s law has probably run into reality already, or certainly will in the next chip generation or so. An Intel video:
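The cost squeeze can be made concrete with two rules of thumb: Moore's law (transistor counts double roughly every two years) and Rock's law (fab cost doubles roughly every four years). The starting figures in this sketch are made-up round numbers, purely to show the shape of the curves:

```python
# Toy model of the economics: transistor counts and fab costs both compound,
# but fab costs compound on a multi-billion-dollar base. The starting values
# below are invented round numbers for illustration only.

def compound_double(start, years, doubling_period):
    """Value after `years` of doubling every `doubling_period` years."""
    return start * 2 ** (years / doubling_period)

# 20 years out from a $1B fab (Rock's law, ~4-year doubling):
fab_cost_billions = compound_double(1.0, 20, 4)        # -> 32.0
# 20 years of Moore's law from a 1-billion-transistor chip:
transistors = compound_double(1e9, 20, 2)              # -> ~1.0e12
print(f"Fab cost: ${fab_cost_billions:.0f}B, transistors: {transistors:.2e}")
```

Exponentially growing fab costs have to be recovered from customers, which is why the "who will pay for the extra computing power" question matters as much as the physics.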
There’s also the fact that processor design has hit a roadblock. Processors lately have increased their power by adding more cores, not faster ones. But the connections between the cores have to be limited to prevent overcomplexity, and per-core efficiency goes down as the amount of state the processor has to keep coherent goes up, which is another reason for running multiple cores.
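The standard way to quantify why piling on cores hits diminishing returns is Amdahl's law (not named in the post, but it captures exactly this limit): if only a fraction of the work parallelizes, the serial remainder caps the speedup no matter how many cores you add. The 95% figure below is an arbitrary example:

```python
# Amdahl's law: speedup from n cores when a fraction p of the work can run
# in parallel. Even a small serial fraction caps the gain from extra cores.

def amdahl_speedup(p, n_cores):
    """Overall speedup when fraction p of the work uses n_cores in parallel."""
    return 1.0 / ((1.0 - p) + p / n_cores)

# Suppose 95% of the workload parallelizes (illustrative number):
print(amdahl_speedup(0.95, 8))    # ~5.9x on 8 cores
print(amdahl_speedup(0.95, 64))   # ~15.4x on 64 cores -- nowhere near 64x
```

Going from 8 cores to 64 buys less than a 3x improvement here, which is the "more cores, diminishing returns" point in concrete form.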
So if you can’t get to AI on a chip, could you get there by building a great big machine? No, not really. The problem with supercomputers has always been data lag, due to speed-of-light constraints and switching times. Big machines have to work around those problems by synchronizing processes and tolerating fairly long wait cycles, and that doesn’t work very well in an AI environment. It gets even worse if the AI is spread across a network where most of the machines have limited processing capability. So no Colossus-type machine intelligences.
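The speed-of-light "data lag" is easy to put numbers on. This back-of-the-envelope sketch (the 30 m room size and 3 GHz clock are my illustrative assumptions) counts how many clock cycles pass while a signal crosses a machine-room-sized computer at the absolute physical limit:

```python
# Back-of-the-envelope check on speed-of-light data lag in a big machine.
# A signal travels at best at c; across a machine room that delay already
# spans hundreds of clock cycles, so distant nodes must sync up and wait.

C = 299_792_458  # speed of light in vacuum, m/s

def cycles_of_latency(distance_m, clock_hz):
    """Clock cycles that elapse while a signal crosses distance_m at c."""
    return distance_m / C * clock_hz

# 30 m across a machine room, 3 GHz clock (illustrative numbers):
print(f"{cycles_of_latency(30, 3e9):.0f} cycles one way")  # ~300 cycles
```

Real signals in copper or fiber are slower still, and switching adds more delay on top, so the hundreds-of-cycles figure is a floor, not an estimate.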
If current manufacturing processes aren’t going to deliver the processing power required, are there likely to be any new technologies that can resolve the issues? Could 3D printing, for instance, allow the creation of the layered networks involved? That’s always possible, but I think there are tool issues. Here’s a case in point:
I think that neither the tools needed nor the intermediate markets for those tools and the technologies they would enable exist yet. Nobody really even knows what the technologies that will be created will do, or what people will need them for. Technology isn’t linear, and disruptive technologies tend to come from left field and surprise everybody.
The tools are not the only issue. The “software” that constitutes intelligence isn’t something that’s really understood. There’s a great deal of effort to change that, but the big difficulty is that you can’t just open people up and probe the hardware to figure out what’s going on in there. Better sensors are allowing a better look at what’s happening inside people’s heads, but the knowledge base is still in its infancy.
Frankly, even if AIs become available, they’re not going to be anything all that scary, at least to the people living with the technology. Technologies don’t just exist by themselves in a vacuum. Steam engines were surrounded by a raft of complementary technologies: the steam engine required the ability to cast iron, make brass, crucible steel, and machine tools, and the engines in turn powered better pumps and better machine tools. Likewise AIs will be surrounded by other technologies. An AI civilization will more than likely look more like “Ghost in the Shell” than “The Terminator.”
If that happens, the only questions left are whether you should connect in or not, and what happens if you do.