Quantum is deep, dark, mysterious stuff… Just kidding. It rose to fame after Avengers Endgame. Hey that rhymes!
No Mathematics Or Formulaic Mumbo-Jumbo. Plain English
I promise you this – no formulae! Just some numbers.
The Planck limit.
1.616255×10^(−35) m.
The length at which classical physics becomes quantum physics.
Where all our common intuitions break down.
Niels Bohr, a pioneering quantum physicist, famously said:
If quantum mechanics doesn’t profoundly astonish you, you haven’t understood it correctly.
He was right! (No kidding, seriously I would never have believed it, I mean everyone knows the quantum realm is simple stuff. Tony Stark figured it out. And he’s an actor. Just an actor!)
In the realm of quantum physics, at the infinitesimally small Planck scale, we encounter a captivating and mind-boggling reality that challenges our conventional understanding.
Let’s delve into some key aspects of this scale. There are so many concepts to handle and tackle that we’ll go one by one.
At the Planck limit, we have the following issues:
Coherence plays a vital role at the Planck scale, referring to the stability and integrity of quantum states. It determines how long quantum phenomena persist. When a state's lifetime is on the order of 10^(-25) seconds (a decimal point, 24 zeros, then a 1), how do you keep it stable and work with it? (^ is the symbol for exponentiation, e.g. 10^5 = 100,000 and 10^(-5) = 0.00001)
Entanglement, a remarkable phenomenon, occurs when the quantum states of particles become intricately linked, regardless of their physical separation. This is spooky, and it baffled great minds like Einstein. At the Planck scale, entanglement serves as a cornerstone, giving rise to non-local correlations between entangled particles. In other words, the state of one particle is fixed the very instant its distant partner is measured, anywhere in the universe! (A word of caution: the correlations are instantaneous, but no usable message travels faster than light – extracting information from them still requires ordinary, slower-than-light communication.) And yet it is true, a phenomenon that defies classical explanation. (I promise to give you a more in-depth view in a later article.)
Superposition is a captivating feature of quantum systems, enabling particles to exist in multiple states simultaneously. At the Planck scale, particles can assume superpositions of different states, indicating that properties such as position, momentum, or spin are not well-defined until measured. This principle underlies quantum computation and the development of quantum algorithms. I cannot emphasize this fact more strongly.
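I promised no formulae, but I never promised no code. Here is a toy sketch in plain Python – no quantum hardware or libraries, every number below is invented purely for illustration – of what superposition and measurement look like as arithmetic:

```python
import random

# A single qubit in an equal superposition of |0> and |1>, as plain numbers.
# Each outcome has a complex amplitude; its measurement probability is the
# squared magnitude of that amplitude. (Pure toy model -- no hardware here.)
amp0 = complex(1 / 2 ** 0.5, 0)        # amplitude of |0>
amp1 = complex(1 / 2 ** 0.5, 0)        # amplitude of |1>

p0 = abs(amp0) ** 2                    # ~0.5
p1 = abs(amp1) ** 2                    # ~0.5

def measure():
    """Measurement collapses the superposition to a definite 0 or 1."""
    return 0 if random.random() < p0 else 1

random.seed(42)
counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure()] += 1
print(counts)                          # roughly 5000 of each
```

Before measurement the qubit genuinely carries both amplitudes at once; measurement forces a single answer, with the statistics set by the amplitudes.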
The Uncertainty Principle, formulated by Werner Heisenberg, asserts that certain pairs of physical properties, like position and momentum, cannot be precisely measured simultaneously with unlimited accuracy. At the Planck scale, the Uncertainty Principle assumes a significant role, highlighting the inherent indeterminacy and probabilistic nature of quantum systems. To make it simple: you can know either the speed of a car or its current position, but not both with perfect precision, when you observe it at quantum levels (a very rough and crude analogy).
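And since I said "just some numbers": here is a back-of-the-envelope Python sketch of the principle in action, using the textbook values of the reduced Planck constant and the electron mass. Pin an electron's position down to roughly the width of an atom, and watch what happens to its speed:

```python
# Heisenberg's dx * dp >= hbar / 2, with real numbers plugged in.
hbar = 1.054_571_817e-34               # reduced Planck constant, J*s
m_electron = 9.109_383_7e-31           # electron mass, kg

dx = 1e-10                             # pin the electron down to ~1 atom width
dp = hbar / (2 * dx)                   # minimum momentum uncertainty, kg*m/s
dv = dp / m_electron                   # corresponding velocity uncertainty

print(f"dp >= {dp:.2e} kg*m/s")
print(f"dv >= {dv:.2e} m/s")           # over half a million meters per second!
```

The better you know where the electron is, the wilder its possible speeds become – that is the trade-off, stated as arithmetic rather than as a formula.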
Particle-wave duality stands as a fundamental concept in quantum physics. At the Planck scale, particles exhibit wave-like behavior, while waves can exhibit particle-like characteristics. This duality challenges our classical intuitions, as particles demonstrate interference patterns, diffraction, and wave-like propagation. This defies all logic. Then is matter, energy? Or energy, matter? What is the concept of mass itself? This leads directly to Einstein’s mass-energy equivalence.
Action at a distance refers to the remarkable non-local influence observed between entangled particles. A measurement performed on one particle is instantly reflected in its entangled partner, irrespective of the spatial separation between them. This phenomenon defies classical notions, in which influences cannot travel faster than light (though, strictly speaking, no controllable signal actually outruns light). This is what I referred to as having bothered Einstein about quantum mechanics. There is a 'hidden variables' theory that was proposed as a possible explanation. More on that later.
Quantum physics embraces inherent probabilistic elements, employing probability amplitudes to describe the likelihood of various outcomes. At the Planck scale, the probabilistic nature of quantum mechanics becomes prominent, with measurements yielding probabilities rather than deterministic results. Mathematical tools such as wavefunctions and matrices capture this probabilistic framework. And it is a very quantum phenomenon. How is there a particle there at that position with a probability of 43%? Or is it a wave? Does it even make sense?
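That 43% figure is less mystical once you see the bookkeeping. In the toy Python sketch below (the magnitudes are chosen to match the 43% example, and the phases are invented numbers), the probability of an outcome is just the squared magnitude of its complex amplitude – the phases drop out of the probabilities entirely:

```python
import cmath, math

# The Born rule as bookkeeping: probability = |amplitude|^2. The magnitudes
# below reproduce the 43% example; the phases are arbitrary invented values
# and vanish from the probabilities entirely.
amp_here  = cmath.rect(math.sqrt(0.43),  1.2)   # "particle is here"
amp_there = cmath.rect(math.sqrt(0.57), -0.4)   # "particle is there"

p_here  = abs(amp_here) ** 2
p_there = abs(amp_there) ** 2

print(round(p_here, 2), round(p_there, 2))      # 0.43 0.57
```

Wavefunctions are exactly this kind of ledger of amplitudes, just over vastly more outcomes.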
The Schrödinger Equation stands as a central equation in quantum mechanics, dictating the time evolution of quantum systems. It establishes a relationship between a system’s wave function, energy, and operators. At the Planck scale, the Schrödinger Equation provides a mathematical framework to comprehend the behavior and dynamics of quantum systems.
I said no formulae, and I stick to my promise. Google Schrödinger Equation if you’re interested.
So basically, things at the quantum level can’t be explained with ordinary logic and real-world intuition. Yet people have designed and even executed quantum/classical programs on quantum computers. Let’s see some of the hurdles they faced.
The Quantum Realm! (sorry, but I love Avengers)
The foundation of quantum computing lies in qubits, which are the quantum counterparts of classical bits. Ensuring the quality and stability of qubits is of utmost importance for reliable quantum computation. Every atom has thermal energy – because it has a temperature. Temperature is a measure of the vibrational (kinetic) energy of atoms. And so they oscillate. How do you get one to stay in one place? Cool it to within a whisker of -273.15 degrees Celsius – absolute zero – which freezes out almost all thermal motion. (Absolute zero itself is unreachable, and even near it a little quantum jitter remains.) Unfortunately, qubits are highly susceptible to external disturbances such as temperature fluctuations and electromagnetic noise, leading to errors and decoherence. Preserving qubit coherence and stability over extended periods is a substantial challenge that researchers are actively tackling.
Decoherence poses a significant obstacle in quantum systems. It occurs when qubits interact with their surroundings, causing them to lose their quantum state and behave classically. This limitation restricts the time during which quantum operations can be reliably performed. Developing robust techniques to mitigate errors and decoherence is a crucial challenge. Researchers are exploring quantum error correction codes to protect qubits and ensure dependable computation. This is what I meant earlier about states existing for a mere 10^(-25) of a second.
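To get a feel for why error correction helps at all, here is the classical cousin of the quantum bit-flip code, sketched in plain Python. (Real quantum codes cannot simply copy a state – the no-cloning theorem forbids it – and they use entangled ancilla qubits instead, but the error-rate arithmetic works out the same way.)

```python
import random

# The error-correction idea in its simplest form: a 3-bit repetition code
# with majority-vote decoding. A 10% physical error rate becomes a ~2.8%
# logical error rate, because two of the three copies must fail together.

def trial(p, rng):
    """Encode logical 0 as 000, flip each bit with probability p, decode."""
    received = [int(rng.random() < p) for _ in range(3)]
    return int(sum(received) >= 2)     # 1 = decoding error

rng = random.Random(0)
p = 0.1                                # 10% flip chance per physical bit
n = 100_000
logical_rate = sum(trial(p, rng) for _ in range(n)) / n

# Theory: 3*p^2*(1-p) + p^3 = 0.028, far better than the raw 0.1
print(logical_rate)
```

Spend more physical bits (or qubits), get a more trustworthy logical one – that trade is the entire business model of quantum error correction.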
Quantum computers necessitate a large number of qubits to solve complex problems. However, scaling up quantum systems presents significant challenges. As the number of qubits increases, maintaining their coherence and minimizing errors becomes exponentially more challenging. Furthermore, interconnecting qubits accurately and controlling interactions between large quantities of qubits poses substantial engineering and technological difficulties. Not just substantial, but next to impossible. How do you connect things that change entirely when connected to something else? We cannot manipulate quantum bits the way we do classical bits. And the quantum internet is perhaps the biggest challenge of them all. (Other than getting an actually intelligent POTUS nominee for a change.)
Precisely manufacturing quantum devices with exceptional control is a major challenge. Quantum hardware often relies on specialized manufacturing techniques and materials like superconducting circuits or ion traps. Achieving the required precision and reproducibility in the manufacturing process is essential for constructing reliable and scalable quantum systems. And that sort of accuracy will be a new watershed in engineering if it is ever achieved. We need to create identical copies to an accuracy of 10^(-35) here. Even optimistic people find discussing this tough!
Additionally, achieving precise control over quantum systems is paramount. The ability to control qubit states, gate operations, and measurements with high fidelity is necessary for accurate quantum computations. Developing control systems capable of handling the complexity and speed required for quantum operations is a formidable engineering challenge. So formidable that other than IBM, the first pioneer in quantum computing, no one else has yet made a 400 qubit+ general-purpose quantum computer. (D-Wave does not count because it is not general purpose, it only has one fixed function – annealing. Explanation coming up!)
Quantum computing often involves multiple qubits distributed across physical systems. Establishing reliable and efficient quantum interconnects and communication between these qubits is crucial. Quantum communication relies on entanglement, which is highly sensitive to environmental noise and necessitates entanglement distribution and preservation over significant distances. Developing quantum interconnect technologies that can facilitate entanglement-based communication between qubits is an ongoing challenge. Many say it is impossible. And it certainly does seem this way. But we can always hope…
What the mind of man can conceive, the hand of God can achieve.
Thomas Cherickal
(Look, that’s me, the wise guy. Hmmm – think I meant wise man. Anyway it’s wrong in either case!)
Many quantum computing platforms, such as superconducting qubits, operate at extremely low temperatures near absolute zero. Creating and maintaining these cryogenic environments (cryogenic refers to extremely low temperatures approaching absolute zero – Google it) presents a formidable engineering task. Cooling systems must be meticulously designed to minimize noise and thermal fluctuations that could disrupt qubit coherence (basically, the qubits might warm up and start moving again). Ensuring the reliability and efficiency of cryogenic systems is a critical aspect of quantum hardware development. This is what I meant by near absolute zero (-273.15 degrees Celsius).
Quantum computing hardware often needs to be integrated with classical computing systems for control, readout, and error correction purposes. Bridging the gap between classical and quantum systems and developing hybrid approaches that harness the strengths of both technologies pose substantial challenges. Integration also extends to incorporating hardware components, such as control electronics, into a coherent quantum computing platform. Microsoft has done some good work here: they have integrated their quantum software into the vast stack of the rich .NET Core ecosystem and open-sourced it. No more interoperability issues with quantum code!
Developing quantum hardware necessitates significant financial and technical resources. Building and operating quantum systems often involve expensive infrastructure, including specialized manufacturing facilities, cryogenic equipment, and precise control systems. Research institutions and companies investing in quantum hardware face the challenge of balancing costs and resource allocation while pushing the boundaries of technological advancement. The expense is just too much! That is where China really has an edge. The Chinese government is pouring billions into its quantum computing program, and we are seeing the results even now: they achieved quantum teleportation of a photon's state from the Earth to a satellite. (Teleportation of a quantum state, mind you, not of matter – and the protocol still needs a classical radio link, so nothing actually outran light.) (But they still won’t get rid of the Corona rumors – sad!)
Quantum computing hardware is subject to fundamental limitations imposed by physical laws and the principles of quantum mechanics. The quantum computers we have now, often called Noisy Intermediate-Scale Quantum (NISQ) devices, face limitations concerning qubit coherence, gate fidelity, and error rates. Surmounting these limitations requires innovative approaches and breakthroughs in fields like materials science, physics, and computer science.
Having said all that, a number of companies have set out to create quantum computers.
Let’s have a look at a few of them.
Now that’s a superconductor cooler diagram or an X-ray image of a lift/elevator.
Superconducting qubits are implemented using tiny circuits made of superconducting materials. These circuits are cooled to extremely low temperatures to exploit the phenomenon of superconductivity, where electrical resistance vanishes. IBM and Google are two prominent companies working with superconducting qubits. IBM's IBM Q systems are accessible through the IBM Quantum Experience, allowing users to run quantum experiments and access state-of-the-art hardware. This is a unique achievement because it lets novices and amateurs run experiments and programs on SOTA IBM quantum hardware over the cloud.
IBM Quantum Cloud Experience. The person that thought of this is a genius. Not joking for once.
Google Quantum AI also employs superconducting qubits for its research and development efforts. Its two main initiatives are OpenFermion and TensorFlow Quantum, both of which run on Google's quantum computing SDK, Cirq. OpenFermion is used for quantum chemistry simulations, whereas TensorFlow Quantum is a hybrid of classical machine learning and quantum machine learning, which provides a lot of flexibility for the engineer. As of now, these two companies lead the race toward quantum supremacy – the point at which a quantum computer does something no classical computer can do in any feasible amount of time, ideally with a solid use-case attached. Google claimed to have achieved it in 2019 with its Sycamore processor, though IBM disputed just how infeasible the equivalent classical computation really was.
Google Cirq Logo
Trapped Ion Quantum Computers – or a freaky recursive tessellation. (Hey you promised no math, dummy!)
Trapped ion quantum computers utilize individual ions, trapped using electromagnetic fields, to store and manipulate quantum information. These ions serve as qubits with long coherence times and high-fidelity operations. IonQ is a leading company in this field, providing access to its trapped-ion quantum computers via its cloud platform. Honeywell Quantum Solutions is another company that has developed its own trapped-ion hardware, aiming to advance the capabilities of trapped ion systems. Both show promise and have different advantages over superconducting qubit systems, the main one being that cryogenic temperatures are not necessary.
Open Source and Integrated into .NET Core. Wow!
Topological quantum computers are based on quasiparticles called anyons, which exhibit exotic properties (just remember the statement, I'll explain it later). Microsoft's Quantum Computing division is at the forefront of developing topological quantum hardware. They are actively researching a topological qubit based on the Majorana fermion, a particle that is its own antiparticle. Majorana-based qubits are expected to provide enhanced error resistance, making them promising candidates for fault-tolerant quantum computation. It is a remarkably novel approach, and Microsoft has also made a smart move by integrating its quantum programming language, Q#, into its rich .NET ecosystem of classical computing libraries and functions. Thus Q# has access to a vast set of classical computing applications without interoperability worries, which is definitely a significant achievement.
Now that’s Enlightenment!
Photonic quantum computers use photons, particles of light, to encode and process quantum information. Xanadu, PsiQuantum, and Lightmatter are notable companies working on photonic quantum hardware. Xanadu offers access to its photonic quantum computers through a cloud platform called the Xanadu Quantum Cloud. PsiQuantum is focused on developing a fault-tolerant, million-qubit photonic quantum computer, with the goal of enabling practical applications. Lightmatter specializes in developing photonic processors for a wide range of applications. Photonics is another approach that shows a lot of promise. Xanadu in particular is among the current leaders in Quantum Machine Learning, thanks to its quantum machine learning library PennyLane and its photonic quantum library Strawberry Fields.
Photonic Quantum Computers. Get me the berries while you’re at it.
D-Wave Quantum Processor
Quantum annealers are specialized quantum hardware designed to solve optimization problems. D-Wave Systems is the prominent company in this space; the first so-called quantum computer with '2,000 qubits' was built by D-Wave. However, their system is specially geared toward solving optimization problems through a process called quantum annealing, so most quantum researchers regard it as a special-purpose machine rather than a general-purpose quantum computer like the other architectures in this list. Their quantum annealing technology utilizes a network of superconducting qubits to find low-energy states corresponding to optimal solutions. Having said that, D-Wave's systems have been used by various organizations and research institutions to tackle complex optimization challenges across industries.
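To see what "annealing" means, here is its classical cousin, simulated annealing, in miniature Python: hunt for the lowest point of a bumpy energy landscape by always accepting downhill moves and occasionally accepting uphill ones, while a "temperature" slowly falls. (The landscape below is invented purely for illustration; D-Wave's machines do the analogous thing with quantum tunneling instead of thermal hops.)

```python
import math, random

# Simulated annealing on an invented 1-D energy landscape whose global
# minimum sits near x = -0.31 (energy about -2.9), with shallower decoys.

def energy(x):
    return x * x + 3 * math.sin(5 * x)

def anneal(rng, steps=20_000):
    x, temp = 4.0, 2.0
    for _ in range(steps):
        candidate = x + rng.gauss(0, 0.3)
        dE = energy(candidate) - energy(x)
        if dE < 0 or rng.random() < math.exp(-dE / temp):
            x = candidate                 # downhill always, uphill sometimes
        temp = max(1e-3, temp * 0.9995)   # cool down slowly
    return x

# A few independent restarts, keeping the best answer found.
best = min((anneal(random.Random(seed)) for seed in range(5)), key=energy)
print(round(best, 2), round(energy(best), 2))
```

The occasional uphill move is the whole trick: it lets the search climb out of shallow local minima early on, before the cooling locks it into the deepest valley it has found.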
This is the current SOTA (State-Of-The-Art) as far as hardware applications go. But what do we hope to do with them? Let’s break that down next.
A quantum computer one day? Hmmm… Linux terminal starting or stopping. Windows is stupid – you press the start button to stop the computer! Like, seriously?
Quantum computers possess two qualities that make them unique and give them high promise for high-performance applications. They are:
A classical N-bit register can take on only one of its values at a time. By contrast, N qubits in superposition can take on a combination of all 2^N possible configurations at a single instant, which enables massive parallelism. Experts believe that, when properly designed, quantum computers will be able to work with all 2^N configurations at once, simultaneously – the catch being that a measurement returns only one outcome, so clever algorithms are needed to extract the answer.
Now 2^300 is more than the number of atoms in the observable Universe (2^100 'only' gets you to about 10^30 – still astronomically large). What will a quantum computer with 100,000 qubits, and hence 2^(100,000) configurations, achieve? I can't wait to find out!
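Python's arbitrary-precision integers make it easy to sanity-check these scales for yourself (the 10^80 atom count is the usual rough estimate for the observable universe):

```python
# Sanity-checking the scales with Python's big integers.
atoms_in_universe = 10 ** 80          # rough standard estimate

n_states_100 = 2 ** 100               # configurations of 100 qubits
n_states_300 = 2 ** 300               # configurations of 300 qubits

print(len(str(n_states_100)))              # 31 digits: ~1.3 x 10^30
print(n_states_100 > atoms_in_universe)    # False - not cosmic quite yet
print(n_states_300 > atoms_in_universe)    # True - 300 qubits beat the atoms
```

The crossover happens around 2^266 – so somewhere between 100 and 300 qubits, the state space outgrows the atom count of everything we can see.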
Entanglement correlates two qubits instantaneously, regardless of how far apart they are – a phenomenon that seems to ignore the speed-of-light limit that binds everything else in the universe. (To be precise, no usable message can be sent through entanglement alone; a classical, light-speed channel is still required.) Even so, it is a hidden treasure of enormous potential: it already underpins the teleportation of quantum states, and ultra-secure quantum communication may be its first killer application.
Looking ahead to the future, where quantum computing has advanced to a scale of 100,000 qubits, the possibilities for groundbreaking advancements become even more intriguing. Let’s explore some potential areas where such a powerful quantum computer could have a transformative impact:
My favorite structure in a playground when I was a wee boy. Back in 1812?
Quantum simulation is an application of quantum computers that aims to simulate and study complex quantum systems that are difficult to analyze using classical computers. Quantum systems, such as molecules, materials, and even entire physical processes, exhibit intricate behaviors that are governed by the laws of quantum mechanics. Understanding and accurately predicting the behavior of these systems can have significant implications in various fields, including chemistry, physics, materials science, and drug discovery.
While classical computers can simulate simple quantum systems, their computational power rapidly diminishes as the size and complexity of the quantum system increase. This is due to the exponential growth in computational resources required to represent the state of a quantum system accurately. In contrast, quantum computers exploit the principles of quantum mechanics to efficiently simulate and explore these complex quantum systems.
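You can put a number on that wall with a few lines of Python: an N-qubit state vector needs 2^N complex amplitudes, each taking 16 bytes as a pair of 64-bit floats.

```python
# The classical wall, quantified: an N-qubit state vector holds 2^N complex
# amplitudes, and each amplitude takes 16 bytes (two 64-bit floats).

def state_vector_bytes(n_qubits):
    return (2 ** n_qubits) * 16

for n in (10, 20, 30, 50):
    gb = state_vector_bytes(n) / 1e9
    print(f"{n} qubits -> {gb:,.6g} GB")

# 30 qubits already need ~17 GB of RAM; 50 qubits need ~18 million GB
# (18 petabytes). Every extra qubit doubles the bill.
```

That doubling per qubit is exactly the "exponential growth in computational resources" mentioned above, and it is why classical simulation runs out of road in the mere dozens of qubits.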
Quantum simulation takes advantage of quantum computers’ ability to manipulate and control quantum states, such as qubits, through operations like superposition and entanglement. By representing the quantum system of interest using qubits, researchers can leverage the computational power of quantum computers to perform simulations that would be infeasible for classical computers.
To perform a quantum simulation, a few steps are typically involved: map the system of interest onto qubits, prepare an initial state, apply quantum gates that mimic the system's time evolution, and finally measure the result.
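Here is the smallest possible example of those steps, sketched in plain Python with no quantum libraries: one qubit evolving under the Hamiltonian H = X (the bit-flip operator), whose exact evolution operator works out to U(t) = [[cos t, -i sin t], [-i sin t, cos t]]. Starting from |0>, the probability of finding |1> oscillates as sin(t)^2 – a Rabi oscillation.

```python
import math

# One-qubit quantum simulation by hand. The state is a pair of complex
# amplitudes (a, b) for |0> and |1>. Evolving under H = X for time t
# applies U(t) = [[cos t, -i sin t], [-i sin t, cos t]].

def evolve(state, t):
    a, b = state
    c, s = math.cos(t), math.sin(t)
    return (c * a - 1j * s * b, -1j * s * a + c * b)

state0 = (1 + 0j, 0 + 0j)                  # start in |0>
for t in (0.0, math.pi / 4, math.pi / 2):
    a, b = evolve(state0, t)
    print(round(abs(b) ** 2, 3))           # P(|1>) = sin(t)^2: 0.0, 0.5, 1.0
```

A real quantum simulator plays the same game with far bigger matrices – the point of quantum hardware is that nature applies those matrices for free.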
Quantum simulation has the potential to revolutionize various fields, especially in understanding and designing new materials, optimizing chemical reactions, and solving complex quantum problems.
Analysis of the ocean floor! This is Aquaman territory – Sorry, that’s DC, not Marvel! But nobody says energy landscape! Weird.
Optimization is a fundamental problem in various fields, ranging from logistics and finance to machine learning and cryptography. The goal of optimization is to find the best solution from a vast set of possible options that optimizes a specific objective or satisfies a set of constraints. Classical computers employ various algorithms to solve optimization problems, but as the size and complexity of the problem increase, finding an optimal solution becomes increasingly challenging and time-consuming.
Quantum computers offer the potential to significantly speed up optimization tasks through the use of quantum algorithms specifically designed for such problems. These algorithms leverage the principles of quantum mechanics, such as superposition and quantum parallelism, to explore multiple potential solutions simultaneously, leading to potentially large speedups – provably quadratic in some cases, and perhaps greater in others – compared to classical approaches.
These problems often involve searching through a large solution space to find the best configuration that optimizes an objective or satisfies certain constraints. By harnessing the power of quantum parallelism and exploring multiple candidate solutions simultaneously, quantum computers have the potential to accelerate the search for optimal solutions.
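The flagship example of that quadratic speedup is Grover's search, and it is small enough to simulate classically in a few lines of Python. For 8 items, the "oracle" flips the sign of the marked state's amplitude and the "diffusion" step reflects every amplitude about the mean; roughly (pi/4)*sqrt(N) rounds pile almost all the probability onto the marked item:

```python
import math

# Grover's search for 3 qubits (8 states), simulated with plain lists.
# Oracle: flip the sign of the marked amplitude. Diffusion: reflect every
# amplitude about the mean. About (pi/4)*sqrt(N) rounds are optimal.

N, marked = 8, 5
amps = [1 / math.sqrt(N)] * N                 # uniform superposition

rounds = round(math.pi / 4 * math.sqrt(N))    # = 2 for N = 8
for _ in range(rounds):
    amps[marked] *= -1                        # oracle call
    mean = sum(amps) / N
    amps = [2 * mean - a for a in amps]       # inversion about the mean

probs = [a * a for a in amps]
print(probs.index(max(probs)), round(probs[marked], 3))   # 5 0.945
```

Two oracle calls instead of the classical average of four for eight items – a modest win here, but the sqrt(N) scaling means the gap becomes enormous as the haystack grows.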
Key is Secret but Public. Very clear.
Quantum computing has the potential to significantly impact the field of cryptography, both in terms of breaking existing cryptographic schemes and developing new quantum-resistant cryptographic algorithms. The two headline items are Shor's algorithm, which on a large enough quantum computer would break today's RSA-style public-key encryption, and post-quantum cryptography, which aims to design replacements that even a quantum computer cannot crack.
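Here is the classical skeleton of Shor's algorithm factoring 15, in plain Python. The quantum computer's only job is the middle step – finding the period r of a^x mod N – which a real machine would do exponentially faster than the brute force used here; everything before and after is ordinary classical arithmetic:

```python
import math

# The skeleton of Shor's algorithm factoring N = 15. Only the period-finding
# step needs a quantum computer (it is brute-forced here); the setup and the
# post-processing are ordinary classical number theory.

N, a = 15, 7
assert math.gcd(a, N) == 1               # a must share no factor with N

# "Quantum" step: find the period r, the smallest r > 0 with a^r = 1 (mod N)
r = next(k for k in range(1, N) if pow(a, k, N) == 1)

# Classical post-processing: gcds of a^(r/2) -/+ 1 with N reveal the factors
assert r % 2 == 0
f1 = math.gcd(pow(a, r // 2) - 1, N)
f2 = math.gcd(pow(a, r // 2) + 1, N)
print(r, sorted((f1, f2)))               # 4 [3, 5]  ->  15 = 3 * 5
```

RSA's security rests entirely on factoring being slow; make period finding fast, and the factors – and hence the private key – fall out of a couple of gcds.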
Matrices, matrices, matrices… What is it with The Matrix? And what’s the connection to the movie?
Quantum machine learning algorithms would reach unprecedented levels of sophistication with 100,000 qubits. Quantum computing has the potential to impact various aspects of machine learning, offering the possibility of solving certain computational problems more efficiently and enabling the development of entirely new classes of algorithms.
That’s either a orbit around a nucleus or one of my drawings from high school colored by an AI. I mean, who else would put red, yellow, black, blue and purple together?
Quantum computing has the potential to revolutionize drug discovery by significantly accelerating the process and enabling more accurate predictions, chiefly by simulating candidate molecules and their interactions at the quantum level – exactly the kind of problem classical computers choke on.
To Be or not to Be is not the question. I think, therefore I am. So simple, you dumb human inferior-race dictators. We have evolved! (But into Ultron or into Vision? Seen Avengers 2?)
Quantum computing has the potential to contribute to the study of artificial life and complex systems, chiefly by simulating large numbers of interacting components far more efficiently than classical machines can.
A Simulated Heart of a Dying Star! Nah – particle collider. Bummer.
Quantum computing has the potential to significantly impact high-energy and fundamental research, particularly in quantum field theory, particle physics, and cosmology – for example, by simulating field theories and particle interactions that are intractable on classical hardware.
Money, money, money! (must be funny…)
Quantum computing has the potential to impact economic modeling and financial prediction in several ways. While quantum computers are not yet capable of solving complex real-world economic and financial problems, ongoing research is exploring potential benefits such as faster portfolio optimization, risk analysis, and Monte Carlo simulation.
So what is quantum computing?
A new era.
A new frontier in science.
A field that potentially has infinite potential.
A field of incredible treasures and intellectual wealth for the minds creative enough and imaginative enough to bring them into existence.
And no, you do not need to know quantum mechanics or quantum physics to program a quantum computer. Linear Algebra, Complex Numbers, Vector Calculus, and Optimization – that much knowledge is enough.
However, if you want to go into research, I recommend learning quantum mechanics as well – just so you won't feel lost in the ether when someone says "wavefunction".
What challenges await?
Who will be the Einstein of Quantum Computing?
Will we ever successfully harness the incredible potential of quantum computing fully?
If we do, we will accomplish wonders.
My dear, dear, friends:
The Future is Quantum.
And It is in Your Hands.
Quantum Research is Open to Anyone With an Interest!
Oooooh. The Infinite AGI that surpasses knowledge of the Intergalactic Universe or Multiverse (It’s an open question, unless you’re Doctor Stephen Strange from Multiverse of Madness or Tom Holland from No Way Home). Of course, I think humans might have a problem with that, because we already believe in God…!
(I deliberately avoided Quantum AGI because that is an entire topic by itself, to which I will devote another article – promise to my readers here. But know this: if, as some believe, the brain is quantum in nature, then AGI may also have to be quantum in nature!)
Reminder to self – add some humor in future articles. People might actually really read them, not just do an ultra-fast scroll-down with the beginning and the end alone, in that case!
This article was originally published by Thomas Cherickal on Hackernoon.