There is no such thing as a free lunch. But for years, computer programmers coded as if there were. Computer hardware was improving so quickly that a programmer could write mediocre code and count on the hardware to make up the lost time. As computer chips continued to follow the famous Moore's Law, which observes that the number of transistors on a processor chip roughly doubles every two years, and as clock speeds climbed along with transistor counts, it seemed computer hardware would continue to outstrip programming indefinitely.
However, around 2005, computer clock speeds plateaued in the 1 to 3 gigahertz range. That is, the number of basic operations a chip can perform each second leveled out at one to three billion. That may seem like a large number, but not in the world of computer programming: to play a video game, a computer executes trillions of instructions. Inefficiencies in the code pile up into seconds of delay, which is unacceptable to the modern computer user. And with hardware no longer getting faster, the computer programmer is finally without a free lunch.
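To see why a few billion operations per second is not as roomy as it sounds, consider a rough cycle budget for a single video-game frame. The sketch below is a back-of-the-envelope calculation, not a benchmark; the 3 gigahertz clock, 60 frames per second, and one-operation-per-cycle figures are simplifying assumptions.

```python
# Back-of-the-envelope: the cycle budget for one video-game frame.
# Assumes a 3 GHz clock, 60 frames per second, and one operation
# per cycle -- all simplifications of real hardware.

CLOCK_HZ = 3_000_000_000  # 3 gigahertz: three billion cycles per second
FPS = 60                  # a common target frame rate

cycles_per_frame = CLOCK_HZ // FPS
print(f"Cycle budget per frame: {cycles_per_frame:,}")  # 50,000,000

# If inefficient code wastes just 10% of every frame's budget,
# that is five million cycles lost, sixty times each second.
wasted_per_second = (cycles_per_frame // 10) * FPS
print(f"Cycles wasted per second: {wasted_per_second:,}")  # 300,000,000
```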
Computer programmers have found ways to streamline their code, following algorithms designed to be as efficient as possible. Computer engineers have used the continued validity of Moore's Law and the ever-shrinking transistor to create multi-core processors, which allow instructions to execute simultaneously even though the clock speed has remained constant. But engineers predict that Moore's Law is only a few years away from failing, so they are working to develop new ways to meet society's demand for smaller, faster, more powerful computer chips.
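As a rough illustration of what extra cores buy when clock speed is fixed, the sketch below splits one task across four worker processes, each of which can run on its own core. The sum-of-squares workload and the choice of four workers are arbitrary stand-ins for this example, not anything specific to real chip designs.

```python
# A minimal sketch of multi-core parallelism: the same work,
# done serially on one core and then split across four processes.
from concurrent.futures import ProcessPoolExecutor

def sum_of_squares(bounds):
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

if __name__ == "__main__":
    n = 10_000_000

    # Serial: one core grinds through the whole range.
    serial = sum_of_squares((0, n))

    # Parallel: four workers each take a quarter of the range,
    # running simultaneously at the same clock speed.
    chunks = [(i * n // 4, (i + 1) * n // 4) for i in range(4)]
    with ProcessPoolExecutor(max_workers=4) as pool:
        parallel = sum(pool.map(sum_of_squares, chunks))

    assert serial == parallel  # same answer, computed in parallel
```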
One way engineers are seeking to shrink computers and increase speeds is by finding more efficient ways to store memory. Memory has always been tricky for computer engineers. Non-volatile memory retains its information even when the power turns off, but it is notoriously expensive and slow to access: computers spend hundreds of clock cycles waiting on data from it. Volatile memory like RAM, random access memory, requires constant power to maintain its information, but is faster to access. This split between volatile and non-volatile memory is the reason students lose changes to papers if the computer shuts off before they press save. Though faster, memory like RAM can take up 50 to 80 percent of a chip's area.

MRAM, magnetoresistive RAM, is not a new technology. It was first introduced 25 years ago as small, quick, non-volatile memory, but it was difficult to implement in most mainstream applications. A form of MRAM called STT-MRAM, Spin-Transfer Torque MRAM, is causing a stir today because it is well suited to modern applications: it is smaller than commonly used RAM, uses less power, and is non-volatile yet still fast. STT-MRAM is one example of how engineers constantly refine good ideas and find ways to improve performance.
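The volatile/non-volatile split maps onto something every computer user has done: typing into a program (data held in RAM) and pressing save (data copied to disk). Here is a toy Python sketch of that distinction; the file name draft.txt is just an example.

```python
# A toy picture of volatile vs. non-volatile storage: a Python
# variable stands in for RAM, a file on disk for non-volatile
# memory. The file name "draft.txt" is an arbitrary example.

draft = "An essay the student has not saved yet."  # lives in RAM

# Pressing "save" copies the data to non-volatile storage.
with open("draft.txt", "w") as f:
    f.write(draft)

# If the power failed now, the variable `draft` would vanish
# with the RAM's charge, but draft.txt would survive on disk:
# slower to reach, yet durable.
with open("draft.txt") as f:
    print(f.read())
```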
Another form of non-volatile memory researchers are attempting to adapt for modern computers is ferroelectric material. Ferroelectric material has a natural polarization that can be switched when an electric field is applied to it. That is, the material has built-up regions of positive and negative charge that can be flipped by applying a voltage. A computer can read these polarized charges as a 1 or a 0, allowing memory to be stored in the material. And like STT-MRAM, the data lasts even without constantly applied power, so it survives a computer shut-off. Ferroelectric materials have already been used in a number of applications, like transit cards for bus fare, but they have historically been too slow for use in computers' integrated circuits. Researchers at the University of California, Berkeley and the University of Pennsylvania have been working to make the switching of polarity in ferroelectrics faster, fast enough to be a strong contender for the computers of the future.
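To make the read-a-polarization-as-a-bit idea concrete, here is a toy model of a single ferroelectric cell. It is purely illustrative: the class, its method names, and the "up"/"down" encoding are inventions for this sketch, not how a real cell is built or controlled.

```python
# A toy model of a single ferroelectric memory cell. The class
# and its names are inventions for this sketch; a real cell is
# physics, not a Python object.

class FerroelectricCell:
    def __init__(self):
        self.polarization = -1  # negative polarization reads as 0

    def apply_field(self, direction):
        """An applied field flips the stored polarization."""
        self.polarization = 1 if direction == "up" else -1

    def read(self):
        """The computer reads the polarization as a 1 or a 0."""
        return 1 if self.polarization > 0 else 0

cell = FerroelectricCell()
cell.apply_field("up")
print(cell.read())  # 1, and the 1 persists with no power applied
```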
These are just a few examples of how scientists, engineers and programmers are working to keep technology advancing. A great deal of work goes into each advance in computing, and innovation, creativity and hard work are becoming increasingly important in shaping the industry. Sometimes, old methods that seemed as if they would never work can prove, with just a few changes, to be the next exciting possibility.