Quantum Computing is becoming quite the hot topic lately. With research being done by Google, IBM, Microsoft, universities, and a number of other players, it's looking like this is really going to happen. In fact, Google may be just weeks away from announcing the Quantum Supremacy milestone. If you aren't familiar with the concept yet, Quantum Supremacy is basically the point where a quantum computer can complete a computation in a short time that a classical computer can't complete at all. While there are some simple quantum computers out there right now (you can try out IBM's via the Q Experience), this milestone will be a big deal.
Now, quantum computing isn't for everyone. I'm pretty sure it's not even for me, but I am interested in the technology, so I am intrigued to learn more. The point of this post isn't to teach you all of the details about quantum computing (because I am far from qualified to do that). It's to introduce some of the basic concepts you should go learn about and point you to resources where you can learn more. My current limited quantum computing knowledge comes from resources from Wikipedia, IBM, and Microsoft. Google has some resources too, but I think most of them assume you already have a PhD in Physics or Computer Science. Some of the resources do a better job of explaining concepts than others, and since this is a complex topic, you can learn something from reading each of them. I will warn you that this stuff is complicated. It has to be if quantum computing is going to be as revolutionary as predicted. With a Computer Science degree, I learned about a lot of these concepts at university, but I have long since forgotten them. Let's face it, you don't use things like linear algebra much when you are creating an Intranet in Office 365.
What follows below are some of the key areas of quantum computing that I think you will want to learn more about. There's a lot to learn about quantum computing, but this should give you some good building blocks.
What is Quantum Computing?
The promise of quantum computing is that these machines will be able to solve computational problems in minutes or hours that would take years on a classical computer. It all starts with qubits and how they interact with each other. To understand more, start by reading the Introduction in the Beginner's Guide of the IBM Q Experience. Then go read Microsoft's take on it. They also have a short video about it. Finally, if you want to go deep, read up on Wikipedia.
Who are the players?
IBM, Google, and Microsoft have all been in the media a lot lately with announcements. If we were to compare this to the space race, IBM has someone in orbit, Google is about to plant a flag on the moon, and Microsoft just decided what type of rocket fuel to use. I think Microsoft picked a compelling type of rocket fuel though: topological quantum computing. They have a world-class team working on it too. D-Wave Systems was the first company to have a commercially viable quantum device, but it's limited to specific scenarios. This is not to discount the teams of university researchers throughout the world that these companies have partnered with.
One company you may notice is missing, despite sitting on buckets of cash, is Apple. Apple has not announced any quantum computing research at this point. There are a lot of theories on this, but since Apple can't make a quantum computer "pretty" and overcharge for it, I don't think it's ever going to happen.
Greek Letters
If it's been a while since you've looked at the Greek alphabet, you'd better go do so now. Due to the amount of linear algebra involved, there are Greek letters everywhere, and while you might remember Alpha or Beta, remembering Psi, Phi, and Theta might be a bit harder. When you are reading through the documentation, pull up the Greek alphabet article on Wikipedia and keep it handy as a reference.
Linear Algebra
To understand the world of quantum computing, you need to understand several linear algebra concepts. Specifically, you need to understand the concept of vectors and matrices. You'll want to understand their notation and the operations on them. For example, you'll need to understand how to add and multiply matrices together and the tensor product. The Vectors and Matrices article in Microsoft's quantum computing concepts section is quite good. The following section on Eigenvalues and Eigenvectors is important as well. They also recommend the book Linear Algebra (additional materials) Third Edition by Jim Hefferon (available for free at the links). If you've never studied linear algebra (or it's been a while), it might be worth reviewing that book as you get into things.
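To make these operations concrete, here's a minimal pure-Python sketch of matrix-vector multiplication and the tensor (Kronecker) product. The helper names `mat_vec` and `kron` are my own for illustration; any linear algebra library has real equivalents.

```python
def mat_vec(m, v):
    """Multiply matrix m (a list of rows) by column vector v."""
    return [sum(m[i][j] * v[j] for j in range(len(v))) for i in range(len(m))]

def kron(a, b):
    """Tensor (Kronecker) product of two column vectors a and b."""
    return [x * y for x in a for y in b]

# |0> and |1> as column vectors
zero = [1, 0]
one = [0, 1]

print(mat_vec([[0, 1], [1, 0]], zero))  # [0, 1]: the matrix swapped the entries
print(kron(zero, one))                  # [0, 1, 0, 0]: a 4-entry two-qubit vector
```

Notice how the tensor product of two 2-entry vectors gives a 4-entry vector. That's why the state space grows exponentially as you add qubits.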
Qubits
Whereas bits are the basic unit of classical computing, qubits are the fundamental unit of information in the quantum computing world. However, they are significantly more complex. Like a bit, a qubit can have a value of 0 or 1, but it can also be in both states at once through something called superposition. We can visually represent a single qubit with a Bloch Sphere: a sphere of radius one with an X, Y, and Z axis. This may not make sense yet, but it helps later when we try to understand how gates transform a qubit. IBM has a pretty good description of qubits and the Bloch Sphere. However, it made more sense to me when I read through Microsoft's description.
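Here's a tiny pure-Python sketch of that idea (the names `probabilities` and `plus` are just mine, not from any SDK): a single qubit is a pair of complex amplitudes, and the probability of measuring 0 or 1 is the squared magnitude of each amplitude.

```python
import math

def probabilities(state):
    """The chance of measuring each basis state is |amplitude|^2."""
    return [abs(amp) ** 2 for amp in state]

ket_zero = [1, 0]                            # |0>: always measures 0
plus = [1 / math.sqrt(2), 1 / math.sqrt(2)]  # an equal superposition of 0 and 1

print(probabilities(ket_zero))  # [1, 0]
print(probabilities(plus))      # roughly [0.5, 0.5]: a 50/50 coin flip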
Column vectors are the basis for representing qubits. When we perform operations on a qubit, we are effectively transforming those vectors (or matrices when there are multiple qubits). However, column vector notation can be a bit cumbersome at times. That's where Dirac notation comes in. The Q Experience getting started guide talks about the 0 and 1 states being represented as |0> and |1>, but I didn't really pick up on why there. Microsoft has a page on Dirac Notation that explains it quite well. When you start representing multiple qubits in Dirac notation, they will look something like |01> or |00>. However, it wasn't clear to me until I read the IBM documentation that the qubits should be read from right to left. That means the qubit on the right is the first qubit.
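To see how Dirac labels line up with a state vector, here's a small pure-Python sketch (the helper `dirac_labels` is a made-up name for illustration). Each entry in a two-qubit state vector corresponds to one of the labels |00>, |01>, |10>, |11>, in order of the index written in binary.

```python
def dirac_labels(state):
    """Map each amplitude in a state vector to its Dirac-notation label."""
    n = len(state).bit_length() - 1  # number of qubits (vector length is 2^n)
    return {format(i, f"0{n}b"): amp for i, amp in enumerate(state)}

# An amplitude of 1 at index 1 means the system is in state |01>.
# In IBM's right-to-left convention, the rightmost digit is the first
# qubit, so this is "qubit 0 is 1, qubit 1 is 0".
print(dirac_labels([0, 1, 0, 0]))  # {'00': 0, '01': 1, '10': 0, '11': 0}
```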
Operations / Gates
In a classical computer, a small set of logic gates (AND, OR, NOT, NAND) maps bits to bits. In quantum computing, there are an infinite number of possible transformations on a single qubit. In practice, though, there are only a few that you deal with in these early examples. The gates are all named after mathematicians and theoretical physicists who have long since passed. Remember, we have known about quantum computing for some time. We just didn't know how to get there. When looking at the gates, it may be easier to think of what each gate does when represented on the Bloch Sphere. The gates known as X, Y, and Z correspond to operations around the axes of the sphere. For example, the X gate can be thought of as a "bit-flip" because it effectively flips 0s and 1s in the vector representation of the qubit. The Q Experience Beginner's Guide explains this gate fairly well. Throughout their guide, you can also click a link to open the Composer, where you can actually try out these operations on a simulator and even their working quantum computers. The Y and Z gates perform similar operations around their respective axes. If you want to get more into the math behind all of these operations, the Microsoft page has quite a bit of detail.
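The "bit-flip" behavior of the X gate is easy to see in a sketch: applying the X matrix to a qubit's column vector just swaps its two amplitudes. A minimal pure-Python version (the `apply` helper is my own name):

```python
def apply(gate, state):
    """Apply a 2x2 gate matrix to a single-qubit column vector."""
    return [sum(gate[i][j] * state[j] for j in range(2)) for i in range(2)]

X = [[0, 1], [1, 0]]    # bit-flip: swaps the |0> and |1> amplitudes
Z = [[1, 0], [0, -1]]   # phase-flip: negates the |1> amplitude
Y = [[0, -1j], [1j, 0]] # a combination of both, with complex entries

print(apply(X, [1, 0]))  # [0, 1]: X|0> = |1>
print(apply(Z, [0, 1]))  # [0, -1]: Z|1> = -|1>, a sign (phase) change
```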
To put a qubit into a state of superposition, you use the Hadamard gate, often labeled as an H gate. To learn about the H gate, check out the Creating Superposition page on the Q Experience first. You'll use this gate a lot as you're starting to experiment.
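You can see what the H gate does with the same matrix-on-vector sketch as before (pure Python, illustrative names only): starting from |0>, it produces equal amplitudes on |0> and |1>, giving a 50/50 measurement outcome.

```python
import math

def apply(gate, state):
    """Apply a 2x2 gate matrix to a single-qubit column vector."""
    return [sum(gate[i][j] * state[j] for j in range(2)) for i in range(2)]

s = 1 / math.sqrt(2)
H = [[s, s],
     [s, -s]]  # the Hadamard gate

state = apply(H, [1, 0])                # start in |0>, apply H
print(state)                            # roughly [0.707, 0.707]
print([abs(a) ** 2 for a in state])     # roughly [0.5, 0.5]: equal superposition
```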
Entanglement
Using the Controlled NOT (CNOT) gate, we can put multiple qubits into an entangled state. When two qubits are entangled, measuring one qubit affects the other. This concept is a bit harder to grasp, so I've got a number of references. First, look at Microsoft's page on Multiple Qubits. You can read a bit more from IBM. However, it made the most sense to me once I tried it out in a simulator (step 6).
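The standard H-then-CNOT recipe can be sketched in pure Python too (all helper names are mine, and I've written the two gates out as 4x4 matrices over the basis |00>, |01>, |10>, |11> rather than building them from tensor products). The resulting Bell state has amplitude only on |00> and |11>, which is exactly why measuring one qubit tells you the other.

```python
import math

def apply(gate, state):
    """Apply an n x n gate matrix to an n-entry state vector."""
    n = len(state)
    return [sum(gate[i][j] * state[j] for j in range(n)) for i in range(n)]

s = 1 / math.sqrt(2)

# H applied to the rightmost qubit (identity on the other), as a 4x4 matrix.
H_q0 = [[s,  s, 0,  0],
        [s, -s, 0,  0],
        [0,  0, s,  s],
        [0,  0, s, -s]]

# CNOT with the rightmost qubit as control: swaps |01> <-> |11>,
# i.e. flips the left qubit whenever the right qubit is 1.
CNOT = [[1, 0, 0, 0],
        [0, 0, 0, 1],
        [0, 0, 1, 0],
        [0, 1, 0, 0]]

bell = apply(CNOT, apply(H_q0, [1, 0, 0, 0]))  # start in |00>
print(bell)  # roughly [0.707, 0, 0, 0.707]: only |00> and |11> remain
```

There is a 50% chance of measuring 00 and a 50% chance of measuring 11, but never 01 or 10; the two qubits always agree.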
Experimenting with the IBM Q Experience
The best way to understand some of this is to try some experiments. I started with the IBM Q Experience. It's been out for some time, so I had looked at it before Microsoft's Quantum Development Kit Preview came out this week. The Q Experience has a Composer, where you can visually drag and drop gates onto qubits to run an experiment. It will also show you which quantum computers are available and interesting facts such as what temperature the dilution refrigerator is running at.
You can drag and drop gates anywhere you like and then choose to Run or Simulate your results. This provides a visual representation of the underlying code going into the quantum computer. You can click on Switch to QASM Editor and you will see the code. If it looks like classical assembly language to you, that's because it's very similar. Remember, we're just dealing with qubits and gates here.
You can sign up for an account with the Q Experience, and it will give you a fixed number of executions each day on an actual quantum computer. IBM currently has 5 qubit and 16 qubit quantum computers in the Q Experience. Right now, it looks like the 16 qubit machine is offline because it's no longer showing. Sometimes the Run button won't be available at all, though. This all depends on the availability of the quantum computers, as they are taken down for maintenance regularly.
Just launching the Composer is a bit daunting, though. Instead, it's easier to start with the existing experiments throughout the Beginner's Guide. For example, on the CNOT page, you can run a variety of pre-configured examples in the Composer. This is a great way to learn quickly and actually try your results on a real quantum computer.
Experimenting with the Microsoft Quantum Development Kit Preview
This week, Microsoft released the Quantum Development Kit Preview. Microsoft doesn't have a quantum computer yet, but they have quite the development stack already. Using your own computer (and later Azure), you can simulate a quantum computer in Visual Studio 2017. Installation isn't too hard, but you need to do it on actual hardware (not a VM), although I think some virtual machine hosts do support the necessary CPU features if enabled. Let's be clear though: simulating a quantum computer is CPU intensive and can be slow depending on the complexity of your algorithm.
I'll be posting a detailed walkthrough of the development kit pretty soon, so I won't go into a lot of detail here today. Once you have the kit installed, go through the Quickstart. It walks you through a program from the ground up. To date, it has taught me the most about quantum computing.
When will quantum machines be commercially available?
We are starting to see a number of successful prototypes in the works now. From what I last read, IBM is targeting 2021 and Microsoft says around "five years", so 2022. I suspect Google may be closer than that. That may seem pretty far off, but it's really not in the scheme of things. The field of quantum computing actually started in the 1980s, and it has taken us this long just to get where we are.
Why should you care?
Quantum computing isn't for everyone. That's true. However, once we enter a post-quantum computing era (and we will), the benefits will eventually affect you. I'm picturing a Y2K style gold-rush caused by quantum computing in the next ten years. I absolutely believe in quantum computing and while I'll never be capable of contributing directly to the research, I think I can help evangelize it.
I also recommend watching this keynote from the Microsoft Quantum team at Future Decoded. It has some great visual representations of quantum computing as well as what industries it may make an impact in.
Summary
When it comes to quantum computing, there is a lot to learn. You are not going to learn it all in a day. When you are reading about quantum computing, it's easy to get lost. That's ok though. I've read several of these pages multiple times, and I pick up a little more each time. I'll keep this post updated as I find other good resources.
Finally, my usual disclaimer: I'm far from an expert in the field of quantum computing. If I got something wrong here, kindly correct me in the comments as opposed to trolling me.