# On the Differences Between Classical and Quantum Circuits


**What They Are, Why They Matter, How We Can Understand Them**

*Do you think that Quantum Computing is an important emerging technology that we will be able to leverage to accelerate the development of new, advanced and ground-breaking therapies that could possibly save billions of human lives?*

In the future, Quantum Computing could become a ground-breaking technology for Pharma and BioTech companies. For example, Quantum Computing could help biologists to greatly accelerate drug discovery by speeding up calculations related to protein folding. The reasons for this will become evident during this interview.

Protein folding is the physical process by which a linear polypeptide folds into its characteristic and functional three-dimensional structure. This three-dimensional structure determines how a protein interacts with other molecules within the body. Linear polypeptides are composed of sequences of many amino acids, and it is this sequence that determines how a polypeptide folds through processes of free energy minimization. With x amino acids, the number of possible final protein structures is roughly 3^x, and therefore the number of possible structures a protein can have grows exponentially as a function of the number of amino acids used to form the polypeptide.
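To get a feel for this exponential blow-up, here is a minimal sketch (the function name is just for illustration, and 3^x is the rough combinatorial estimate from the text, not a biophysical model):

```python
# Rough illustration of the conformational search space described above:
# ~3^x possible structures for a polypeptide of x amino acids.
def possible_structures(x: int) -> int:
    """Approximate number of conformations for a chain of x amino acids."""
    return 3 ** x

for x in (10, 50, 100):
    # Even modest chain lengths yield astronomically many conformations.
    print(f"{x} amino acids -> ~{possible_structures(x):.3e} structures")
```

Even at 100 amino acids, a short protein by biological standards, the count already dwarfs the number of atoms in the observable universe, which is why exhaustive classical search is hopeless.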

Protein folding is computationally heavy and very time consuming to solve with Classical Computing. This is because of the sheer number of possible protein structures biologists would need to explore and analyse to determine those that are most efficacious when interacting with the human body. Given a number and type of amino acids as input, Quantum Computing could help to generate and analyse the efficacy of all the possible protein structures ‘simultaneously’, rather than ‘sequentially’ as with Classical Computing. Through Quantum Computing, one could essentially simulate specific free energy minimization processes to create drugs containing proteins with just the right structure (i.e., the most efficient and efficacious structure to cure specific diseases).

*Why do you think Quantum Computing is so difficult for Biologists to understand and how can we help them to better comprehend this field and the implications that it could have in their future ways of working?*

Biologists usually do not have a strong Computer Science or Physics background, so Quantum Computing is usually difficult for them to understand. We can help them to better understand this field by pairing them up with external Quantum Computing experts who can be hired as consultants and contract workers.

There is an educational component to the whole process that is very important to start as soon as possible. Biologists will probably never have the time to learn to code or to earn a degree in Computer Science or Physics. However, the earlier some domain knowledge experts can help biologists to grasp at least the foundational concepts of Quantum Computing, the faster the interactions between the Quantum and Bio worlds can start to produce interesting and beneficial results.

The most important implication for biologists is that by using Quantum Computing, they will be able to solve very complex problems much faster than they can today (for certain problems, exponentially faster than through Classical Computing). Biologists would need to explain the challenges they are facing to Computer Scientists. Computer Scientists can then figure out which of the biologists’ problems are likely to yield the greatest benefits (i.e., the fastest accelerations) from Quantum Computing and can create new Quantum Computing algorithms for such problems. Furthermore, physicists can provide advice as to when they think the Quantum hardware infrastructure will be mature and stable enough to run such new algorithms at scale.

*What are the fundamental differences between Quantum and Classical Computing and how do you think we can explain them easily to people in the Life Sciences domain that may not have a Computer Science, Mathematics or Physics background?*

To understand how quantum computers fundamentally differ from the classical computers and servers that we work with or interact with from day to day, it is perhaps best to first break things down conceptually into their smallest parts.

**Bits vs Qubits:**

Within classical computers, a bit is the most basic unit of information, where each bit can take on a value of ‘1’ (on) or ‘0’ (off). Combinations of 1s and 0s are used to store larger pieces of data, as well as to collectively dictate sets of instructions for a computer to interpret and carry out.

Within a classical circuit, the value of a bit can be changed via a logic gate. Examples of logic gates include the NOT gate which simply converts the incoming bit to the opposite value (1 => 0 or 0 => 1), and the AND gate which takes two bits as an input and in turn outputs a ‘1’ if and only if both input bits have a value of ‘1’. Otherwise, the AND gate outputs a ‘0’ (11 => 1; 10 => 0; 01 => 0; 00 => 0).
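The two classical gates described above can be sketched in a few lines of Python (the function names are purely illustrative):

```python
# Minimal sketch of the classical NOT and AND gates described above.
def NOT(a: int) -> int:
    """Flip the incoming bit: 1 => 0, 0 => 1."""
    return 1 - a

def AND(a: int, b: int) -> int:
    """Output 1 if and only if both input bits are 1."""
    return 1 if (a == 1 and b == 1) else 0

# Truth tables matching the text:
for a in (1, 0):
    print(f"NOT({a}) = {NOT(a)}")
for a in (1, 0):
    for b in (1, 0):
        print(f"AND({a},{b}) = {AND(a, b)}")  # 11 => 1; 10, 01, 00 => 0
```

Every classical circuit, however complex, ultimately reduces to compositions of a handful of such gates.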

For quantum computers, a qubit represents the most basic unit of information. Qubits are similar to classical bits in that their measured values correspond to ‘1’ or ‘0’. However, qubits fundamentally differ from bits in that they act in a quantum way and are thus subject to various quantum phenomena.

**Quantum Behaviour:**

But what does it mean for something to act in a ‘quantum’ way? When working with things that are very small, the physics appears to differ vastly from what we see around us on a daily basis at the macro level. When left alone, electrons behave as probability waves that can interfere with each other, analogous to waves rippling out from stones dropped in a puddle. Much like the adjacent ripples in a puddle formed by two dropped stones, two electrons’ probability waves can ‘add up’ or ‘cancel’ each other out at different points of ‘contact’ between their respective waves to form maxima and minima associated with typical wave interference. Each electron’s probability wave can even interfere with itself. At this point, one might be confused, as electrons are so often portrayed as little balls that orbit around the nuclei of atoms, not as waves. The truth, however, is that electrons are also particles.

When a successful observation of an electron takes place, its probability wave (or wave function) collapses, causing it to stop acting as a wave and to instead exist as a particle. The probability wave dictates the probability of measuring the electron at a particular location. To put it another way, an unobserved electron confined to a box exists as a probability wave spread throughout the box. Measuring the electron’s position collapses this probability wave, and the electron is found in a definite position somewhere in the box. Repeatedly preparing the electron in the same wave state and measuring its location as described above will result in finding the electron at locations throughout the box with probabilities exactly given by the probability wave. This portrays the idea of wave-particle duality. Furthermore, we can put the electron in two states within the box simultaneously! We can put the electron in a lower-frequency and a higher-frequency probability wave within the box at the same time. This is known as a state of superposition. Specifically, we can put the electron in the lower-frequency state with probability x and the higher-frequency state with probability y, where x + y = 1. We will discuss below how quantum computers take advantage of superposition and perform computations by manipulating the probabilities x and y with gate operations.
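The measurement statistics just described can be mimicked with ordinary random sampling. This is only an illustrative sketch of the probabilities x and y (it does not model the underlying wave mechanics), with an arbitrarily chosen x = 0.25:

```python
import random

# Sketch: "measuring" an electron prepared in a superposition of a
# lower-frequency state (probability x) and a higher-frequency state
# (probability y), with x + y = 1. Purely classical sampling, for intuition.
def measure(x: float) -> str:
    """Collapse the superposition: 'low' with probability x, else 'high'."""
    return "low" if random.random() < x else "high"

random.seed(0)                # fixed seed for reproducibility
x = 0.25                      # illustrative choice; y = 1 - x = 0.75
counts = {"low": 0, "high": 0}
for _ in range(10_000):
    counts[measure(x)] += 1
print(counts)                 # roughly 2,500 'low' and 7,500 'high'
```

Each individual measurement yields a definite state, yet over many repetitions the frequencies converge to x and y, exactly as repeated measurements of the electron in the box would.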

**Quantum Superposition and Entanglement:**

Another fascinating example of quantum behaviour is entanglement. It is possible to prepare a pair of particles in such a way that each of their states is unknown, yet in some way inherently tied to the other. When a measurement of one particle’s state is made, both wave functions collapse and both states are determined. In other words, when one particle is observed and its state is measured, the state of the second particle is effectively measured as well. This holds across any distance in space and is instantaneous.

Let us say, for instance, that person A will measure the state of an electron at one end of the universe, and that this electron is entangled with another electron that person B will observe at the other end of the universe. Person A could measure the state of their electron, which would in turn collapse the wave function of both their own electron and person B’s electron. By simply measuring the state of their electron, person A can thus instantly infer the state of person B’s electron, without even having to measure or look at it. The reverse is of course also true: person B can collapse and infer the state of person A’s electron by measuring the state of their own.

Notably, the current world record for performing such an experiment is held by China. Scientists there beamed entangled photons from a satellite to two ground stations 1,203 kilometres apart and experimentally confirmed the physics of entanglement at this distance. Importantly, while the probability wave collapses instantaneously, information cannot be sent this way. The uncertain initial state of the particles and their subsequent random collapse into different states mean that only purely random data (i.e., not actual useful or encoded information) is ever ‘sent’ or ‘received’. Additionally, should persons A and B repeat such measurements many times and then compare their lists of outcomes, they would find that the lists are perfectly matched, within a small experimental error. However, the fastest that, for example, A could send B their list is the speed of light. So, this phenomenon does not contradict the principle that information cannot be sent faster than the speed of light.
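Why the matched lists carry no usable information can be seen with a toy simulation. This is a deliberately simplified classical sketch, not real quantum mechanics: each pair collapses to perfectly correlated but individually random outcomes:

```python
import random

# Toy model of the A/B experiment above: each entangled pair collapses to
# the SAME random value for both observers. (Illustrative only; genuine
# entanglement correlations are richer than any classical shared coin.)
def entangled_pair() -> tuple:
    outcome = random.randint(0, 1)  # random collapse: 0 or 1
    return outcome, outcome         # A and B always record the same value

random.seed(1)
a_list, b_list = zip(*(entangled_pair() for _ in range(20)))
print("A:", a_list)
print("B:", b_list)
print("Lists match:", a_list == b_list)  # always True
```

Each list on its own is pure coin-flip noise, so neither observer can encode a message in it; only when the lists are brought together (at light speed or slower) does the correlation become visible.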

There are many other intriguing quantum phenomena that exist, including the ability for one particle to exist in two different places at the same time or even for a particle to seemingly teleport (tunnel) through an otherwise impassable barrier. However, superposition and entanglement are perhaps the most important phenomena when it comes to understanding the nature of qubits and quantum circuits.

**Quantum Circuits:**

Unlike classical bits, a qubit can also be put into a state of superposition, in which the qubit is in both the ‘1’ state and the ‘0’ state at the same time. When a qubit in such a state is measured, its wave function collapses and its state is once again observed as either ‘1’ or ‘0’ (with an associated probability for each outcome). The quantum gate that puts a qubit into an even superposition (with equal measurement probabilities for each state) is known as the Hadamard gate.

Qubits can then be entangled with each other using additional gates known as CNOT gates. In a CNOT gate, the value of a control qubit determines whether or not a second qubit is flipped (11 => 10; 10 => 11; 01 => 01; 00 => 00). By combining a Hadamard gate with a subsequent CNOT gate, a pair of qubits can be put into a state of maximum entanglement, known as a Bell state. There are of course many other quantum gates, such as the I (Identity) gate, the Pauli-X (NOT) gate, the Pauli-Y (π-radian rotation) gate, the Pauli-Z (phase-flip) gate, the SWAP gate and the CSWAP (Fredkin) gate.

Some quantum gates are of course comparable to specific classical gates, an example being the quantum Pauli-X gate and the classical NOT gate. However, for the most part, quantum circuits consist of numerous gates which are impossible to replicate within classical circuits.

Accordingly, numerous circuits consisting of combinations of qubits and quantum logic gates have been mapped out to present ground-breaking new algorithms that in theory demonstrate quantum advantage (the ability for a quantum computer to outperform a classical computer at a given task such as protein folding). Many of these algorithms use the concept of quantum parallelism, but that is a topic we will leave for another interview.

**Bios:**

Jonathan has a first class honours integrated master’s degree in Natural Sciences from Lancaster University, having studied a combination of physics, biochemistry and molecular biology.

Captivated by the areas of quantum mechanics and genetics, he sought to pursue them in a way that matches the demands of the modern informational era, thus undertaking final year modules such as Bioinformatics and Quantum Information Processing, as well as interdisciplinary modules like Biology of Living Systems and Computer Modelling.

Having finished his degree, Jonathan completed a graduate data engineering training course at Rockborne, an expert provider of specialist Data & Analytics consultants. Throughout his time there, Jonathan gained extensive experience in working with real-world data from both an engineering perspective and from a more exploratory ML & AI perspective, while also working on some personal ML projects in his free time.

Eager to pursue a blend of his greatest academic interests, Jonathan is now working with GSK alongside Fausto, Kevin and Professor Stefan to explore potential future applications of quantum computers and quantum classification algorithms within the genetic, genomic and chemistry spaces.

Dr. Bekiranov received his BSEE in electrical engineering at UCLA. He worked as a microwave engineer at Raytheon Electromagnetic Systems Division in Santa Barbara. He received his PhD in theoretical condensed matter physics from the University of California at Santa Barbara (Advisors: Walter Kohn and Philip Pincus) and went on to do a postdoc in statistical/condensed matter physics at the University of Maryland (Advisor: Michael E. Fisher) followed by a postdoc in computational biology at The Rockefeller University (Advisors: Eric D. Siggia and Terry Gaasterland). He pioneered the analysis of high-resolution genomic tiling array data (ChIP-chip, RNA-chip, Repli-chip) as a Bioinformatics Staff Scientist at Affymetrix. He is now an Associate Professor at the University of Virginia School of Medicine working on development of classical and quantum machine learning approaches for biomedical applications and has published over 80 papers in peer-reviewed journals.

Fausto has a double PhD (Information Technology and Computer Science), earning his second master’s and PhD at the University of California, Irvine. Fausto also holds multiple certifications from MIT, Columbia University, London School of Economics and Political Science and soon also from Kellogg School of Management, University of Cambridge and University of California, Berkeley. He has worked in multi-disciplinary teams and has over 20 years of experience in academia and industry.

As a Physicist, Mathematician, Engineer, Computer Scientist, and High-Performance Computing (HPC) and Data Science expert, Fausto has worked on key projects at European and American government institutions and with key individuals, like Nobel Prize winner Michael J. Prather. After his time at NVIDIA Corporation in Silicon Valley, Fausto worked at the IBM T. J. Watson Research Center in New York on Exascale Supercomputing Systems for the US government (e.g., Livermore and Oak Ridge Labs).

Kevin graduated with a BSAE in aerospace engineering from Pennsylvania State University. During his collegiate career he gained experience in a co-op position with Capital One Financial as a Data Analyst in Richmond, Virginia. It was this experience that afforded him the opportunity to find a passion for data munging, applied statistics, and programming. Following graduation, he accepted a full-time offer in their newly formed Digital Enterprise Organization, expanding his technical and analytical knowledge in areas such as distributed computing, clickstream analytics, multivariate testing, anomaly detection, and propensity modeling.

Forever in pursuit of new knowledge, Kevin made a switch to Software Engineering at GSK to lead the Enterprise DevSecOps Organization. There he was able to strategize good development practices for the company, as well as ideate and design technical solutions that improved the development lifecycle by automating manual release tasks. After a few years in the role he joined the Emerging Tech team in GSK R&D to run modern data platforms providing data streaming, Machine Learning batch training and inference, and unified authorization, as well as engaging in future use cases like those related to Quantum Computing.