Why Quantum Computing May Be the Next Turn on the Big Data Highway

KEY TAKEAWAYS

Computer technology has progressed along the same path for decades, but quantum computing is a huge departure from what came before it.

On September 28, 2012, the New York Times ran a story, "Australians Surge in Quest for New Class of Computer," concerning what appears to be a breakthrough in the race to build a working quantum computer.

While the definition of a quantum computer will elude many readers, suffice it to say that a working quantum computer will be revolutionary in the world of technology.

Computer technology underlies the changes in the world that we have experienced in the last 50 years – the global economy, the internet, digital photography, robotics, smartphones and e-commerce all rely on computers. It is important, then, I believe, for us to have some basic understanding of the technology in order to appreciate where quantum computing may be taking us.

In the Beginning, There Was ENIAC

So let's start at the beginning. The first working electronic computer was the Electronic Numerical Integrator and Computer, more commonly known as ENIAC. It was developed at the University of Pennsylvania’s Moore School of Engineering with funding from the U.S. Army, which needed to calculate gunnery trajectories during World War II. (In addition to being an engineering marvel, ENIAC blazed the trail for many major IT projects in the years since, though it arrived too late for its original purpose: World War II ended before the computer was completed.)

The heart of ENIAC’s processing capability was vacuum tubes – 17,468 of them. Because a vacuum tube has only two states – off and on (also represented as 0 and 1) – computers adopted binary arithmetic rather than decimal arithmetic, in which values run from 0 to 9. Each of these individual 0-or-1 values is called a bit, short for "binary digit." (To learn more about the history of the ENIAC, see The Women of ENIAC: Programming Pioneers.)
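To make the idea of a bit concrete, here is a minimal Python sketch (my own illustration, not part of the original article) showing a few decimal values and the 8-bit binary patterns that represent them:

```python
# Illustration: decimal values and their 8-bit binary representations.
for value in (0, 1, 5, 13, 255):
    bits = format(value, "08b")  # pad to 8 binary digits
    print(f"{value:3d} -> {bits}")

# Output:
#   0 -> 00000000
#   1 -> 00000001
#   5 -> 00000101
#  13 -> 00001101
# 255 -> 11111111
```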

It was obviously necessary for there to be some way to represent the numbers, letters and symbols that we are familiar with, so a coding scheme proposed by the American National Standards Institute (ANSI), known as the American Standard Code for Information Interchange (ASCII), eventually became the standard. Under ASCII, we combine 8 bits to form one character, or byte, under a predetermined scheme. That yields 256 possible combinations, representing numbers, upper-case letters, lower-case letters and special characters.
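As a small illustration of that mapping (again, my own sketch rather than anything from the article), Python's built-in ord() function exposes the numeric code behind each character, which can then be written out as 8 bits:

```python
# Illustration: the ASCII codes behind a few familiar characters.
for ch in ("A", "a", "0", "$"):
    code = ord(ch)  # numeric code for the character
    print(f"{ch!r} -> {code:3d} -> {format(code, '08b')}")

# Output:
# 'A' ->  65 -> 01000001
# 'a' ->  97 -> 01100001
# '0' ->  48 -> 00110000
# '$' ->  36 -> 00100100
```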

Confused? Don’t worry about it – the average computer user has no need to know the details. It is presented here only as a building block.

Next, computers progressed fairly rapidly from vacuum tubes to transistors (William Shockley and his Bell Labs colleagues John Bardeen and Walter Brattain won the Nobel Prize for developing the transistor), and then to the ability to put multiple transistors onto one chip to create integrated circuits (ICs). It wasn't long before these circuits included thousands or even millions of transistors on one chip, an approach called very large-scale integration (VLSI). These four categories – 1) vacuum tubes, 2) transistors, 3) ICs and 4) VLSI – are considered the four generations of hardware development, no matter how many transistors can be jammed onto a chip.

In the time since ENIAC "went live" in 1946, and all through these generations, the underlying binary arithmetic introduced with the vacuum tube has remained in place. Quantum computing represents a radical break from this methodology.

Quantum Computing: The Big Break

Quantum computers harness the power of atoms and molecules to perform processing and memory tasks at much higher speeds than a silicon-based computer … at least theoretically. Although there are some basic quantum computers capable of performing specific calculations, a practical model is likely still several years away. But if and when they do emerge, they could drastically change the processing power of computers.

Because of this, quantum computing could greatly improve big data processing, since, at least theoretically, it should excel at the massively parallel processing of unstructured data.
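To give a rough sense of where that parallelism comes from, here is a toy sketch in Python with NumPy (my own illustration, not anything from the article or a real quantum device): a register of n qubits is described by 2^n numbers called amplitudes, so even a small register tracks a large number of states at once.

```python
import numpy as np

# Toy illustration: an n-qubit register is described by 2**n complex amplitudes.
n = 3
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0  # start in the state |000>

# A Hadamard gate puts a single qubit into an equal superposition of 0 and 1.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

# Applying it to every qubit spreads the register over all 2**n basis states.
full_h = H
for _ in range(n - 1):
    full_h = np.kron(full_h, H)
state = full_h @ state

print(state.shape)                     # (8,) -- 2**3 amplitudes tracked at once
print(np.round(np.abs(state)**2, 3))   # equal probability 0.125 for each basis state
```

The only point of the sketch is that the state space grows exponentially with the number of qubits, which is where the hoped-for parallelism in big data workloads would come from.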

Computers have kept on with binary processing for one reason: There has really been no reason to tinker with something that works. After all, computer processing speeds have been doubling every 18 months to two years. In 1965, Gordon Moore (who later co-founded Intel) wrote a paper that detailed what became known as Moore’s law, in which he stated that the density of transistors on a chip would double every two years, resulting in a doubling of processing speed. Although he predicted that this trend would last for only 10 years, it has – remarkably – continued to the present day. (There have been a few computing pioneers who have broken the binary mold. Learn more in Why Not Ternary Computers?)
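As a rough, hedged illustration of how that compounding adds up (my own arithmetic, assuming a clean doubling every two years over roughly five decades of the trend):

```python
# Illustration: the compounding effect of "density doubles every two years."
years = 50                      # roughly 1965 to the mid-2010s
doublings = years / 2           # 25 doublings
growth = 2 ** doublings
print(f"{doublings:.0f} doublings -> about a {growth:,.0f}x increase in density")
# 25 doublings -> about a 33,554,432x increase in density
```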

But the increase in processing speed has been far from the only factor in improved computer performance. Improvements in storage technology and advances in telecommunications have been of almost equal importance. In the early days of personal computers, floppy diskettes held 140,000 characters, and the first hard disk that I bought held 10 million characters. (It also cost me $5,500 and was as big as a desktop computer.) Thankfully, storage has gotten much bigger in capacity, smaller in size, faster in transfer speed, and much, much cheaper.
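To put those numbers in perspective, here is a quick back-of-the-envelope comparison (my own sketch; the modern drive's price and size are illustrative assumptions, not figures from the article):

```python
# Illustration: cost per megabyte of the author's first hard disk
# versus an assumed present-day drive.
old_cost_usd, old_capacity_mb = 5500, 10        # ~$5,500 for ~10 million characters
new_cost_usd, new_capacity_mb = 100, 2_000_000  # assumption: ~$100 for a 2 TB drive

old_per_mb = old_cost_usd / old_capacity_mb     # $550.00 per MB
new_per_mb = new_cost_usd / new_capacity_mb     # $0.00005 per MB
print(f"Then: ${old_per_mb:,.2f} per MB")
print(f"Now (assumed): ${new_per_mb:.5f} per MB")
print(f"Cost per megabyte fell by a factor of about {old_per_mb / new_per_mb:,.0f}")
```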

The great increase in capacity allows us to gather information in areas that we previously could only scratch the surface of, or could not delve into at all. This includes data-heavy topics such as weather, genetics, linguistics, scientific simulation and health research, among many others.

Making Sense of Big Data

Increasingly, big data efforts are finding that despite all the gains in processing power we've made, it just isn't enough. If we are going to make sense of the tremendous amount of data we are accumulating, we will need new ways of analyzing and presenting it, as well as faster computers to process it. Quantum computers may not be ready for action, but experts have been closely watching their progress as the next level of computer processing power. We can't say for certain, but the next big change in computer technology could be a real departure from the silicon chips that have carried us along thus far.

John F. McMullen

John F. McMullen lives with his wife, Barbara, in Jefferson Valley, New York, in a converted barn full of pets (dog, cats, and turtles) and books. He has been involved in technology for more than 40 years and has written more than 1,500 articles, columns and reviews about it for major publications. He is a professor at Purchase College and has previously taught at Monroe College, Marist College and the New School for Social Research. McMullen has a wealth of experience in both technology and in writing for publication. He has worked as a programmer, analyst, manager and director of…