By Diego F. Freire, of Levenfeld Pearlstein, LLC
The contemporary computer processor — at only half the size of a penny — can carry out 11 trillion operations per second with the assistance of an impressive assembly of 16 billion transistors.[1] This feat contrasts starkly with the early days of transistor-based machines, such as the Manchester Transistor Computer, which performed an estimated 100,000 operations per second, used 92 transistors, and was the size of a large refrigerator. For comparison, while the Manchester Transistor Computer could take several seconds or minutes to calculate the sum of two large numbers, the Apple M1 chip can do so almost instantly. This rapid acceleration of processing power and device miniaturization is attributable to the empirical observation known as Moore’s Law, named after the late Gordon Moore, the co-founder of Intel. Moore’s Law posits that the number of transistors integrated into a circuit doubles approximately every two years.[2]
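Moore’s Law is, at bottom, a simple doubling curve. The short Python sketch below is a rough illustration, not a claim from the article: it assumes the 1971 Intel 4004 and its roughly 2,300 transistors as a baseline and projects forward, landing within an order of magnitude of the M1’s 16 billion transistors.

```python
# A minimal sketch of Moore's Law as a doubling curve. The 1971 Intel 4004
# baseline (~2,300 transistors) is an assumption used purely for illustration.
BASELINE_YEAR = 1971
BASELINE_TRANSISTORS = 2_300  # approximate transistor count of the Intel 4004

def projected_transistors(year: int, doubling_period_years: float = 2.0) -> float:
    """Project the transistor count Moore's Law predicts for a given year."""
    doublings = (year - BASELINE_YEAR) / doubling_period_years
    return BASELINE_TRANSISTORS * 2 ** doublings

# Projection for 2020, the year the Apple M1 (about 16 billion transistors)
# was announced; the doubling model lands within an order of magnitude.
print(f"{projected_transistors(2020):.2e}")  # roughly 5e10
```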
These powerful processors have paved the way for advancements in diverse domains, including the disruptive field of artificial intelligence (AI). Nevertheless, as we confront the boundaries of Moore’s Law due to the physical limits of transistor miniaturization,[3] the horizons of computing are extending into the enigmatic sphere of quantum physics — the branch of physics that studies the behavior of matter and energy at the atomic and subatomic scales. It is within this realm that quantum computing arises, offering immense potential for exponential growth in computational performance and speed and thereby heralding a transformative era in AI.
In this article, we explore the captivating universe of quantum computing and its prospective implications for the development of AI, and we examine the legal measures adopted by leading tech companies to protect their innovations in this rapidly advancing field, particularly through patent law.
Qubits: The Building Blocks of Quantum Computing
In classical computing, the storage and computation of information are entrusted to binary bits, which assume a value of either 0 or 1. For example, a classical computer can have a specialized storage device called a register that stores one number at a time using bits. Each bit is like a slot that can be either empty (0) or occupied (1), and together the slots can represent numbers, such as the number 2 (with a binary representation of 010). In contrast, quantum computing harnesses quantum bits, commonly referred to as qubits: infinitesimal particles, such as electrons or photons, defined by quantum properties such as spin or polarization.
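To make the register analogy concrete, this tiny Python sketch (an illustration, not modeled on any particular hardware) treats a 3-bit register as three slots that together hold exactly one number at a time.

```python
# A 3-bit classical register holds exactly one of 8 possible values at a time.
def to_bits(value: int, width: int = 3) -> str:
    """Render a number as a fixed-width binary string, one character per slot."""
    return format(value, f"0{width}b")

register = 2              # the register stores a single number
print(to_bits(register))  # '010': first slot empty, second occupied, third empty
```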
Distinct from their classical counterparts, qubits can exist in a superposition of states, meaning they can represent both 0 and 1 simultaneously. Unlike bit slots, which are either empty or occupied, each qubit can be both empty and occupied at the same time, allowing a register of qubits to represent multiple numbers concurrently. While a bit register can hold only one number at a time, such as 2 (010), a qubit register can represent the numbers 2 and 4 (010 and 100) simultaneously.
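Superposition can be made concrete with a state vector. The numpy sketch below is a simplified classical simulation, not actual quantum hardware: it places a 3-qubit register in an equal superposition of the basis states |010> and |100>, that is, the numbers 2 and 4.

```python
import numpy as np

# A 3-qubit register is described by 8 complex amplitudes, one per basis
# state |000> through |111>; each basis state's index is the number it encodes.
state = np.zeros(8, dtype=complex)
state[0b010] = 1 / np.sqrt(2)  # amplitude for |010>, the number 2
state[0b100] = 1 / np.sqrt(2)  # amplitude for |100>, the number 4

# Measurement probabilities are the squared magnitudes of the amplitudes:
# the register is found to hold 2 or 4 with equal probability.
probabilities = np.abs(state) ** 2
print(np.round(probabilities[0b010], 3), np.round(probabilities[0b100], 3))  # 0.5 0.5
```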
This superposition of states enables the parallel processing of information since multiple numbers in a qubit register can be processed at one time. For example, a classical computer may use two different bit registers to first add the number 2 to the number 4 (010 + 100) and then add the number 4 to the number 1 (100 + 001), performing the calculations one after the other. In contrast, qubit registers, since they can hold multiple numbers at once, can perform both operations — adding the number 2 to the number 4 (010 + 100) and adding the number 4 to the number 1 (100 + 001) — simultaneously.
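The parallelism can be sketched the same way. The toy numpy example below is a classical simulation of a simplified variant of the example above (a real quantum adder would be built from reversible gates): a single “add 1 modulo 8” operation, represented as a permutation matrix, is applied once to the superposed register and transforms both stored numbers at the same time.

```python
import numpy as np

# "Add 1 mod 8" as a reversible operation: a permutation matrix mapping
# each basis state |x> to |x + 1 mod 8>.
add_one = np.zeros((8, 8))
for x in range(8):
    add_one[(x + 1) % 8, x] = 1.0

# Register in an equal superposition of 2 (|010>) and 4 (|100>).
state = np.zeros(8, dtype=complex)
state[0b010] = state[0b100] = 1 / np.sqrt(2)

# One application of the operation updates every component of the
# superposition: the register now holds 3 and 5 simultaneously.
state = add_one @ state
print(np.flatnonzero(np.abs(state) > 1e-9))  # [3 5]
```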
Moreover, qubits exhibit the singular characteristics of entanglement and interference, which allow them to execute intricate computations with a level of efficiency unattainable by classical computers. Entanglement links the states of qubits so that they behave as a single correlated system, letting operations act on the register as a whole rather than on each qubit independently. Interference, in turn, involves performing calculations on multiple possibilities at once and adjusting probability amplitudes so that paths leading to correct answers reinforce one another while paths leading to wrong answers cancel, guiding the quantum system toward the optimal solution. Collectively, these attributes equip quantum computers with the ability to confront challenges that would otherwise remain insurmountable for conventional computing systems, thereby radically disrupting the field of computing and every field that depends on it.
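Interference, too, can be shown in miniature. In the sketch below (a minimal single-qubit simulation), the Hadamard gate puts a qubit into an equal superposition, and applying it a second time makes the amplitudes for |1> cancel: destructive interference steers the system back to |0> with certainty, the same mechanism quantum algorithms use at scale to amplify correct answers and suppress wrong ones.

```python
import numpy as np

# The Hadamard gate: sends |0> to an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

qubit = np.array([1, 0], dtype=complex)  # start in |0>

after_one = H @ qubit      # equal superposition of |0> and |1>
after_two = H @ after_one  # the two paths to |1> interfere destructively

print(np.round(np.abs(after_one) ** 2, 3))  # [0.5 0.5]
print(np.round(np.abs(after_two) ** 2, 3))  # [1. 0.]  back to |0> with certainty
```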
Quantum Computing
Quantum computing embodies a transformative leap for AI, providing the capacity to process large data sets and complex algorithms at unprecedented speeds. This transformative technology has far-reaching implications in fields like cryptography,[4] drug discovery,[5] financial modeling,[6] and numerous other disciplines, as it offers unparalleled computational power and efficacy. For example, factoring the 2048-bit numbers that underpin the widely used RSA public key encryption system is estimated to take a classical computer running the General Number Field Sieve (GNFS) algorithm tens or hundreds of millions of years. In contrast, a quantum computer using Shor’s algorithm (a quantum algorithm) could potentially accomplish the task in a matter of hours or days, jeopardizing the security of encrypted data, communications, and transactions across industries such as finance, healthcare, and government. Leveraging the unique properties of qubits — including superposition, entanglement, and interference — quantum computers are equipped to process vast amounts of information in parallel. This capability enables them to address intricate problems and undertake calculations at speeds that, in certain but not all cases,[7] surpass those of classical computers by orders of magnitude.
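The gulf between the two approaches is visible in their cost formulas. The back-of-the-envelope Python sketch below compares the standard heuristic running-time formula for GNFS against the roughly cubic gate count commonly cited for Shor’s algorithm; constants and hardware speeds are omitted, so the outputs convey scale only, not real running times.

```python
import math

BITS = 2048
ln_n = BITS * math.log(2)  # natural log of a 2048-bit modulus N

# Heuristic GNFS cost: exp((64/9)^(1/3) * (ln N)^(1/3) * (ln ln N)^(2/3)).
gnfs_ops = math.exp(
    (64 / 9) ** (1 / 3) * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3)
)

# Shor's algorithm scales polynomially, roughly (log2 N)^3 quantum gates.
shor_ops = BITS ** 3

print(f"GNFS (classical): ~{gnfs_ops:.1e} operations")  # on the order of 1e35
print(f"Shor (quantum):   ~{shor_ops:.1e} operations")  # on the order of 1e10
```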
The augmented computational capacity of quantum computing promises to significantly disrupt various AI domains, encompassing quantum machine learning, natural language processing (NLP), and optimization problems. For instance, quantum algorithms could expedite the training of machine learning models by processing extensive datasets more efficiently, enhancing performance and accelerating model development. Furthermore, quantum-boosted natural language processing algorithms may yield more precise language translation, sentiment analysis, and information extraction, fundamentally altering how we engage with technology.
Patent Applications Related to Quantum Computers
While quantum computers remain in their nascent phase, the United States Patent and Trademark Office has to date received more than 6,000 applications directed to quantum computers, more than 1,800 of which have been granted as United States patents. Among these applications and patents, IBM emerges as the preeminent leader, trailed by various companies, including Microsoft, Google, and Intel, which are also recognized as significant contributors to the field of AI. For instance, Microsoft is a major investor in OpenAI (the developer of ChatGPT), has developed Azure AI (a suite of AI services and tools for building AI into applications and services), and is integrating ChatGPT into products such as Bing and Microsoft 365 Copilot. Similarly, Google has produced AI breakthroughs such as AlphaGo (the AI that defeated the world champion of the board game Go), hardware like tensor processing units (TPUs) for accelerating machine learning and deep learning tasks, and its own chatbot, Bard, powered by its LaMDA language model.
Patents Covering Quantum Computing
The domain of quantum computing is progressing at a remarkable pace, as current research seeks to refine hardware, create error-correction methodologies, and investigate novel algorithms and applications. IBM and Microsoft stand at the forefront of this R&D landscape, and both enterprises have strategically harnessed their research findings to secure early patents covering quantum computers. Even so, this initial phase may merely represent the inception of a competitive endeavor to obtain patents in this rapidly evolving field. A few noteworthy and recent United States patents granted thus far include:
- Google’s patent titled “Measurement based uncomputation for quantum circuit optimization” (US Patent No. US 11,030,546 B1). This patent, granted on June 8, 2021, aims to address the optimization of quantum circuits to increase computational efficiency. The proposed method involves identifying sequences of operations that un-compute qubits in the quantum circuit and replacing them with an X basis measurement and a classically-controlled phase correction operation. This approach allows for reduced computational complexity, efficient uncomputation of table lookup operations, and the potential to free up qubits for other operations without requiring ancillary qubits from other sources.
- EQUAL1 LABS INC’s patent titled “Quantum shift register based ancillary quantum interaction gates” (US Patent No. US 10,562,764 B2). This patent, granted on February 18, 2020, aims to address the challenges in quantum computing related to isolating microscopic particles, loading them with information, enabling their interaction, and preserving the result of their quantum interaction. Additionally, it addresses the challenges of operating at extremely low temperatures and dissipating significant power needed to run quantum machines. The patent claims a quantum shift register with ancillary functions, which consists of multiple quantum dots and control gates fabricated on a semiconductor substrate. The register allows for controlled transportation of particles in a quantum structure and provides ancillary functions, such as double interaction and bifurcation. This allows the quantum state of one pair of quantum dots to be replicated in another pair. Control is achieved through electric control gate pulses and an optional auxiliary magnetic field, aiming to improve the scalability and efficiency of quantum computing systems.
- Microsoft’s patent titled “Coherent quantum information transfer between conventional qubits” (US Patent No. US 9,152,924 B2). This patent, granted on October 6, 2015, describes a quantum bus that can enable coherent transfer of quantum information between conventional qubit pairs by measuring their joint parity using the Aharonov-Casher effect and an ancillary superconducting flux qubit. This can facilitate the production of maximally entangled qubits, allowing for quantum state teleportation between quantum systems.
Conclusion
Quantum computing signifies a monumental leap forward for AI, offering unparalleled computational strength and efficiency. As we approach the limits of Moore’s Law, the future of AI is contingent upon harnessing qubits’ distinctive properties, such as superposition, entanglement, and interference. The cultivation of quantum machine learning, along with its applications in an array of AI domains, including advanced machine learning, NLP, and optimization, portends a revolution in how we address complex challenges and engage with technology.
Prominent tech companies like IBM and Microsoft have demonstrated their commitment to this burgeoning field through investments and the construction of patent portfolios that encompass this technology. The evident significance of quantum computing in shaping the future of AI suggests that we may be witnessing the onset of a competitive patent race within the sphere of quantum computing.
[1] For example, the Apple M1, announced in 2020.
[2] Multicore processors and GPUs have helped extend Moore’s Law by incorporating more processing units on a single chip, effectively increasing the overall transistor count and computational power. As traditional methods of scaling down transistors face physical limits, leveraging parallelism through multiple cores and specialized GPU architecture has become a key factor in maintaining the growth of computing performance.
[3] Physical limits include electron leakage (caused by quantum tunneling, where electrons can “tunnel” through barriers that they should not be able to pass through, causing increased power consumption and possible errors in computation) and heat dissipation (which occurs because, as transistors become smaller and more densely packed, it becomes increasingly difficult to dissipate the heat generated during their operation, which can lead to overheating and performance degradation).
[4] Enabling the efficient breaking of classical cryptographic schemes while simultaneously driving the development of quantum-resistant algorithms and secure quantum communication protocols for enhanced data protection.
[5] Efficiently simulating molecular interactions and chemical reactions, enabling the identification of new pharmaceutical compounds and optimization of drug properties at a faster pace than traditional methods.
[6] Rapidly solving complex optimization problems, improving risk analysis, and enhancing portfolio management, leading to more accurate predictions and better decision-making tools for financial institutions.
[7] For example, classical computers can be faster when doing simple arithmetic and basic calculations, text processing and data management, and running most everyday applications (like web browsers and office suites).