
What Is Quantum Computing? An essay



I wrote this essay about quantum computing for a school science-writing contest. It's a little long for a forum post, but I think I packed a lot of information into it. Enjoy!

Moore's Law states that every 18 months the number of transistors in a computer chip doubles, thereby increasing the speed of the computer (Arthur, 2005). For chip manufacturers to fit all of the extra transistors, the transistors themselves must become exponentially smaller. However, there must be a limit: transistors cannot keep shrinking forever, and some scientists estimate that this limit will be reached sometime between 2010 and 2020. At such small sizes, electrons will begin to leak out of the circuits (Arthur, 2005). So what does this mean for computers? Will they simply stop becoming faster and effectively halt the progress of technology? The short answer is no.

Recently, scientists have been studying and developing a new kind of computer that works at the quantum level. Instead of having bits made of transistors, quantum computers have quantum bits (qubits), which can be made from almost any particle; the favorites so far have been atoms, photons, and electrons (Gomes, 2005). Traditional bits, the kind used widely today, store either the value 1 or the value 0. However, because of the strange things that happen at the quantum level, a qubit can store the value 1, the value 0, or both values at once. This combination of two values is called a superposition, and physicists say that

any closed quantum system has a superposition of all possible states

(Arthur, 2005). In theory, this means that the qubits are in every possible binary configuration at the same time.

It's as though your computer was simultaneously doing every calculation you'd ever asked it, or ever would, or could

(Arthur, 2005).
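
To make the "every configuration at once" idea concrete, here is a toy sketch (my own illustration, not from the cited articles) of the standard state-vector picture: an equal superposition of n qubits assigns one amplitude to each of the 2^n binary configurations.

```python
import math

def uniform_superposition(n):
    """State vector of n qubits in an equal superposition:
    every one of the 2**n basis states gets the same amplitude."""
    dim = 2 ** n
    amp = 1 / math.sqrt(dim)
    return [amp] * dim

state = uniform_superposition(3)
print(len(state))  # 8 basis states for just 3 qubits
print(round(sum(a * a for a in state), 6))  # probabilities sum to 1
```

Note that reading out the register collapses it to a single configuration, which is why quantum algorithms need clever interference tricks rather than a simple parallel readout.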
This ability to hold many values at once in just a handful of qubits is what makes quantum computing so special. The power that quantum computers would have both frightens and excites people in many fields. One of the most directly affected fields would be data encryption. Right now, data encryption is based on multiplying extremely large prime numbers (Arthur, 2005). The idea behind this method is that it would take centuries to factor the resulting numbers on a modern computer because of the inefficiencies of even the best factoring algorithms. However, the superpositions of quantum computers allow them to use special algorithms to factor large numbers extremely quickly. This makes it seem like quantum computing would be bad for people who rely heavily on encryption. However, because of another special property of qubits, a more powerful technique is available.
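
As a rough illustration of why classical factoring is so slow (a sketch of my own; real encryption keys use numbers hundreds of digits long, and a quantum computer would attack them with Shor's algorithm, not anything like this):

```python
def trial_division(n):
    """Classical factoring by trial division: roughly sqrt(n) steps,
    which becomes astronomical for the huge numbers used in encryption."""
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 1
    return n, 1  # n is prime

# Toy semiprimes; real encryption multiplies primes hundreds of digits long.
print(trial_division(3 * 5))      # (3, 5)
print(trial_division(101 * 103))  # (101, 103)
```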

Quantum cryptography, a new idea in data encryption that fits well with quantum computing, uses a phenomenon called quantum entanglement. Essentially, two or more qubits can become entangled so that anything that affects one will affect the other, too (Begley, 2005). The application to cryptography works as follows: person 1 has two sets of photons, entangled with each other, that contain an encryption key. Person 1 then sends one set to person 2. If the photons make it safely to person 2, then person 2 can recover the key and decrypt the data. However, if they are intercepted, the photons that person 1 kept will be disrupted and no longer be useful (Begley, 2005). Because of this, quantum encryption is much more secure than current methods of encryption; in fact, it is nearly impossible to hack.
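
Here is a deliberately simplified toy model (my own, not how the quantum optics works in detail) of the key point: undisturbed entangled photons deliver the key intact, while an eavesdropper's measurement disturbs the pairs and leaves detectable errors.

```python
import random

def transmit(key, intercepted, rng):
    """Toy model of entanglement-based key exchange: if an eavesdropper
    measures the photons in flight, each pair is disturbed and the two
    parties' bits stop agreeing about half the time."""
    received = []
    for bit in key:
        if intercepted and rng.random() < 0.5:
            bit ^= 1  # the disturbed pair yields the wrong outcome
        received.append(bit)
    return received

rng = random.Random(42)
key = [rng.randint(0, 1) for _ in range(1000)]

safe = transmit(key, intercepted=False, rng=rng)
tapped = transmit(key, intercepted=True, rng=rng)

print(safe == key)  # True: an undisturbed channel delivers the key intact
errors = sum(a != b for a, b in zip(key, tapped))
print(errors > 0)   # interception leaves a statistical fingerprint
```

The design point is that security comes from detection, not secrecy of the channel: the two parties compare a sample of bits, and any eavesdropping shows up as a mismatch rate.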

So why has this not been done yet? Well, it is not as simple as it sounds. Though a full-scale quantum computer is theoretically possible, it would require thousands of qubits, and so far we lack the technology to achieve such a feat (Cho, 2005). However, quantum computers have been built on a very small scale: IBM made several machines between 1998 and 2001 with three, five, and then seven qubits, which could perform simple calculations such as factoring small two-digit numbers (Arthur, 2005). Impractical as these machines are, they mean progress. Unfortunately, there are other limiting factors: a qubit's ability to store information deteriorates over time, and the smaller the qubit, the quicker it decays.

Currently the most promising qubits are likely to be able to store information for around one second, but that is unlikely to be long enough. Larger qubits will be able to hold information for longer, but will defeat the object of a quantum computer being small and fast

(Obstacle for Quantum, 2005).
For all this, though, the biggest problem with quantum computers right now is simply that we have not thought of a good solution yet. After all, what it really boils down to is a new idea. Thousands of things in human history have been labeled impossible until someone thought up an ingenious idea, and now many of those impossible things are taken for granted: computers, television, even electricity. So it is really not too far-fetched to say that quantum computers will be relatively commonplace in the near future; in fact, it might be wrong to say that they will not. We only need someone to wake up one day, shout "Eureka!", and build a practical quantum computer. Until then, we think.

Arthur, C. (2005, January 26). The encryption factor. The Independent, p. 11. Retrieved October 22, 2005, from ProQuest database: http://search.proquest.com/

Begley, S. (2005, October 14). Even scientists marvel at 'spooky' behavior of separated objects. The Wall Street Journal, p. B1. Retrieved October 22, 2005, from ProQuest database.

Cho, D. (2005, September). Quantum computing. Technology Review, 108(9), R&D 2005. Retrieved October 22, 2005, from Massachusetts Institute of Technology Web site.

Gomes, L. (2005, April 25). Quantum computing may seem too far out, but don't count on it. The Wall Street Journal, p. B1. Retrieved October 22, 2005, from ProQuest database.

Obstacle for quantum computer. (2005, July 14). The Guardian, p. 6. Retrieved October 22, 2005, from ProQuest database: http://search.proquest.com/


DNA computing

From Wikipedia, the free encyclopedia


DNA computing is a form of computing which uses DNA and molecular biology instead of the traditional silicon-based computer technologies. A single gram of DNA, with a volume of 1 cm³, can hold as much information as a trillion compact discs, approximately 750 terabytes.


This field was initially developed by Leonard Adleman of the University of Southern California. In 1994, Adleman demonstrated a proof-of-concept use of DNA as a form of computation, which was used to solve the seven-point Hamiltonian path problem. Since the initial Adleman experiments, advances have been made, and various Turing machines have been shown to be constructible.
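
For scale, a seven-point Hamiltonian path problem is small enough to brute-force on any PC; what made Adleman's experiment remarkable was performing the same search with molecules, one random DNA strand per candidate path. A sketch of the brute-force version (my own illustration, using a made-up four-vertex graph, not Adleman's actual instance):

```python
from itertools import permutations

def hamiltonian_path(n, edges):
    """Brute-force Hamiltonian path search: try every ordering of the
    n vertices and keep one where each consecutive pair is an edge --
    the same filtering Adleman did chemically on a soup of strands."""
    edge_set = set(edges)
    for perm in permutations(range(n)):
        if all((a, b) in edge_set for a, b in zip(perm, perm[1:])):
            return list(perm)
    return None

# Small directed graph with a known path 0 -> 1 -> 2 -> 3
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
print(hamiltonian_path(4, edges))  # [0, 1, 2, 3]
```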


Research has covered computation over one-dimensional strands, two-dimensional tiles, and even three-dimensional DNA graph processing.


On April 28, 2004, Ehud Shapiro and researchers at the Weizmann Institute announced in the journal Nature that they had constructed a DNA computer. It was coupled with an input and output module and is capable of diagnosing cancerous activity within a cell and then releasing an anti-cancer drug upon diagnosis.


DNA computing is fundamentally similar to parallel computing -- we take advantage of the many different molecules of DNA to try many different possibilities at once.


For certain specialized problems, DNA computers are faster and smaller than any other computer built so far. But DNA computing does not provide any new capabilities from the standpoint of computational complexity theory, the study of which computational problems are difficult. For example, problems which grow exponentially with the size of the problem (EXPSPACE problems) on von Neumann machines still grow exponentially with the size of the problem on DNA machines. For very large EXPSPACE problems, the amount of DNA required is too large to be practical. (Quantum computing, on the other hand, does provide some interesting new capabilities).
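
The exponential blow-up is easy to see with a back-of-the-envelope calculation (my own illustration, not from the article): brute-force DNA search uses roughly one strand per candidate solution, so the molecule count soon exceeds Avogadro's number, meaning more than a mole of DNA.

```python
AVOGADRO = 6.022e23  # molecules per mole

# One strand per candidate answer: for an n-bit search space,
# the strand count doubles with every extra bit.
for n in (20, 60, 100):
    print(n, 2 ** n > AVOGADRO)
# 20 False, 60 False, 100 True
```

Somewhere near 79 bits the strand count passes a full mole of molecules, so even a massively parallel DNA soup cannot brute-force large instances.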



If his whole post was an essay, shouldn't it all have been within quote tags? :lol: If you read far enough into computing in general, you'll see that having more powerful, spacey-enabled computers in a compact environment isn't too far away. I think by 2010 everyone won't have changed that much, but things will be a lot smaller.


Lol, I wonder how the person who shouts Eureka when he wakes up is going to make a magical quantum computer happen. Great article; it's always nice to know how the future is going to turn out, though it's still a bit confusing, the part where it is a 1 but then it's a 0 and then it's both? I read an article somewhere else about a week ago but it was really, really confusing. It said it had to do with teleportation, and that scientists were already able to teleport bits and stuff within a computer. I don't know, I guess we will just have to wait and see how it all turns out. Cheers!


...it's always nice to know how the future is going to turn out, though it's still a bit confusing...


You know, I think Niels Bohr, a brilliant physicist who had a lot to do with revolutionizing the physics of his time (I think we're talking early 1900s here...), once told a student of his that if you don't feel dazed or giddy after a quantum physics lecture, you have not understood anything of quantum physics. According to Feynman, while only a few people get relativity right, no one has truly understood quantum mechanics completely.


So... join the club :lol: (a rather elite one, at that. :P)


If his whole post was an essay, shouldn't it all have been within quote tags? :lol:

lol you are wrong; in fact, quotation is a good way to make reference to other studies and works, so that you respect the source of the information


Wow, thanks for that. I never really understood quantum computing until reading that. However, could you encrypt using quantum computers and end up with the same problem we have now, with it taking a long time to factorise the large primes or other mathematical numbers used for the future of encryption?


If his whole post was an essay, shouldn't it all have been within quote tags? :angry:

Take what keysmaker said, add the fact that everything that's not in quotes is my original text, and the fact that I asked a mod about whether posting an essay would be alright. But if you still want to argue...


Quantum computing is a combined idea from classical information theory, computer science, and quantum physics. Information theory and quantum mechanics are alike in many ways, and the concept can be brought to light with the help of classical information theory and computer science, including Shannon's theorem, error-correcting codes, Turing machines (automata), and computational complexity. In the near future, the human brain may even be read with the help of quantum computing.