Boosting quantum computer hardware performance with TensorFlow


Source: blog.tensorflow.org

A guest article by Michael J. Biercuk, Harry Slatyer, and Michael Hush of Q-CTRL

Google recently announced the release of TensorFlow Quantum - a toolset for combining state-of-the-art machine learning techniques with quantum algorithm design. This was an important step to build tools for developers working on quantum applications - users operating primarily at the "top of the stack".

In parallel weโ€™ve been building a complementary TensorFlow-based toolset working from the hardware level up - from the bottom of the stack. Our efforts have focused on improving the performance of quantum computing hardware through the integration of a set of techniques we call quantum firmware.

In this article we'll provide an overview of the fundamental driver for this work - combating noise and error in quantum computers - and describe how the team at Q-CTRL uses TensorFlow to efficiently characterize and suppress the impact of noise and imperfections in quantum hardware. These are key challenges in the global effort to make quantum computers useful.

Q-CTRL image

The Achilles heel of quantum computers - noise and error

Quantum computing, simply put, is a new way to process information using the laws of quantum physics - the rules that govern nature on tiny size scales. Through decades of effort in science and engineering we're now ready to put this physics to work solving problems that are exceptionally difficult for regular computers.

Realizing useful computations on todayโ€™s systems requires a recognition that performance is predominantly limited by hardware imperfections and failures, not system size. Susceptibility to noise and error remains the Achilles heel of quantum computers, and ultimately limits the range and utility of algorithms run on quantum computing hardware.

As a broad community average, most quantum computer hardware can run just a few dozen calculations over a time much less than one millisecond before requiring a reset due to the influence of noise. Depending on the specifics, that's about 10²⁴ times worse than the hardware in a laptop!

This is the heart of why quantum computing is really hard. In this context, "noise" describes all of the things that cause interference in a quantum computer. Just like a mobile phone call can suffer interference leading it to break up, a quantum computer is susceptible to interference from all sorts of sources, like electromagnetic signals coming from WiFi or disturbances in the Earth's magnetic field.

When qubits in a quantum computer are exposed to this kind of noise, the information in them gets degraded just the way sound quality is degraded by interference on a call. In a quantum system this process is known as decoherence. Decoherence causes the information encoded in a quantum computer to become randomized - and this leads to errors when we execute an algorithm. The greater the influence of noise, the shorter the algorithm that can be run.
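The randomization can be illustrated in a few lines of code: give an ensemble of qubits in superposition small random phase kicks at each time step, and the ensemble-averaged coherence - the quantity that carries the encoded information - decays toward zero. This is a toy NumPy sketch; the noise model and numbers are illustrative, not a model of any specific hardware.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def mean_coherence(noise_strength, n_runs=2000, n_steps=50):
    """Average off-diagonal coherence of a qubit held in superposition
    while accumulating small random phase kicks at each time step."""
    # Each run accumulates a random total phase; the ensemble-averaged
    # coherence is |mean of exp(i * phase)|, which decays toward zero
    # as the random phases spread out.
    phases = rng.normal(0.0, noise_strength, size=(n_runs, n_steps)).sum(axis=1)
    return abs(np.exp(1j * phases).mean())

print(mean_coherence(0.0))   # no noise: coherence stays at 1
print(mean_coherence(0.2))   # dephasing noise: coherence has decayed
```

The stronger the noise (or the longer the evolution), the faster the coherence decays - which is exactly why noisier hardware can only run shorter algorithms.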

So what do we do about this? To start, for the past two decades teams have been working to make their hardware more passively stable - shielding it from the noise that causes decoherence. At the same time theorists have designed a clever algorithm called Quantum Error Correction that can identify and fix errors in the hardware, based in large part on classical error correction codes. This is essential in principle, but the downside is that to make it work you have to spread the information in one qubit over lots of qubits; it may take 1000 or more physical qubits to realize just one error-corrected "logical qubit". Today's machines are nowhere near capable of getting benefits from this kind of Quantum Error Correction.

Q-CTRL adds something extra - quantum firmware - which can stabilize the qubits against noise and decoherence without the need for extra resources. It does this by adding new solutions at the lowest layer of the quantum computing stack that improve the hardwareโ€™s robustness to error.

Building quantum firmware with TensorFlow

Quantum firmware graphic

Quantum firmware describes a set of protocols whose purpose is to deliver quantum hardware with augmented performance to higher levels of abstraction in the quantum computing stack. The choice of the term firmware reflects the fact that the relevant routines are usually software-defined but embedded proximal to the physical layer and effectively invisible to higher layers of abstraction.

Quantum computing hardware generally relies on a form of precisely engineered light-matter interaction in order to enact quantum logic operations. These operations in a sense constitute the native machine language for a quantum computer; a timed pulse of microwaves on resonance with a superconducting qubit can translate to an effective bit-flip operation while another pulse may implement a conditional logic operation between a pair of qubits. An appropriate composition of these electromagnetic signals then implements the target quantum algorithm.
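This pulse-to-gate correspondence can be made concrete in a toy model: a resonant drive with Rabi rate Ω applied for time t rotates the qubit by angle θ = Ωt, and a "pi pulse" (θ = π) implements the bit flip, i.e. the X gate up to a global phase. The sketch below is a minimal NumPy illustration; the rate and duration are made-up numbers, not parameters of any particular device.

```python
import numpy as np

# Pauli X: the quantum bit-flip (NOT) operation.
X = np.array([[0, 1], [1, 0]], dtype=complex)

def rotation_from_pulse(rabi_rate, duration):
    """Unitary generated by an idealized resonant drive:
    U = exp(-i * theta/2 * X), with theta = rabi_rate * duration."""
    theta = rabi_rate * duration
    return np.cos(theta / 2) * np.eye(2) - 1j * np.sin(theta / 2) * X

# A pi pulse (theta = pi) implements X up to a global phase of -i.
U = rotation_from_pulse(rabi_rate=2 * np.pi * 1e6, duration=0.5e-6)
print(np.round(U, 6))
```

Changing the pulse duration or strength changes θ, which is why imperfections in the drive translate directly into gate errors - the problem the shaped, noise-robust pulses below are designed to solve.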

Quantum firmware determines how the physical hardware should be manipulated, redefining the hardware machine language in a way that improves stability against decoherence. Key to this process is the calculation of noise-robust operations using information gleaned from the hardware itself.

Building in TensorFlow was essential to moving beyond "home-built" code to commercial-grade products for Q-CTRL. Underpinning these techniques (formally coming from the field of quantum control) are tools allowing us to perform complex gradient-based optimizations. We express all optimization problems as data flow graphs, which describe how optimization variables (variables that can be tuned by the optimizer) are transformed into the cost function (the objective that the optimizer attempts to minimize). We combine custom convenience functions with access to TensorFlow primitives in order to efficiently perform optimizations as used in many different parts of our workflow. And critically, we exploit TensorFlow's efficient gradient calculation tools to address what is often the weakest link in home-built implementations, especially as the analytic form of the relevant function is often nonlinear and contains many complex dependencies.

For example, consider the case of defining a numerically optimized error-robust quantum bit flip used to manipulate a qubit - the analog of a classical NOT gate. As mentioned above, in a superconducting qubit this is achieved using a pulse of microwaves. We have the freedom to "shape" various aspects of the envelope of the pulse in order to enact the same mathematical transformation in a way that exhibits robustness against common noise sources, such as fluctuations in the strength or frequency of the microwaves.

To do this we first define the data flow graph used to optimize the manipulation of this qubit - it includes objects that describe available "knobs" to adjust, the sources of noise, and the target operation (here the bit flip).

data flow graph
The data flow graph used to optimize quantum controls. The loop at left is run through our TensorFlow optimization engine.

Once the graph has been defined inside our context manager, an object must be created that ties together the objective function (in this case minimizing the resultant gate error) and the desired outputs defining the shape of the microwave pulse. With the graph object created, an optimization can be run using a service that returns a new graph object containing the results of the optimization.
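Stripped of the specifics, the pattern is: parameterize the pulse, define a cost averaged over noise scenarios, and descend its gradient. The deliberately tiny sketch below optimizes the total rotation angle of a piecewise-constant pulse to be robust against multiplicative amplitude errors. The pulse model, cost, and numbers are illustrative inventions, not Q-CTRL's API, and the hand-coded gradient stands in for the automatic differentiation TensorFlow performs on the data flow graph.

```python
import numpy as np

# Piecewise-constant pulse: amplitudes a[k], each held for dt seconds.
n_segments, dt = 8, 0.1
target_angle = np.pi                    # a bit flip is a pi rotation
errors = np.array([-0.05, 0.0, 0.05])   # multiplicative amplitude-error scenarios

def cost(a):
    """Gate infidelity averaged over the amplitude-error scenarios."""
    theta = dt * a.sum()                # total rotation angle of the pulse
    return np.mean(np.sin(((1 + errors) * theta - target_angle) / 2) ** 2)

def grad(a):
    """Analytic gradient of the cost with respect to each amplitude
    (TensorFlow would derive this automatically from the graph)."""
    theta = dt * a.sum()
    dc_dtheta = np.mean(
        0.5 * (1 + errors) * np.sin((1 + errors) * theta - target_angle)
    )
    return np.full_like(a, dc_dtheta * dt)

a = np.full(n_segments, 1.0)            # initial guess: theta = 0.8
for _ in range(500):
    a -= 5.0 * grad(a)                  # plain gradient descent

print(cost(a))           # small residual infidelity
print(dt * a.sum())      # total rotation angle, close to pi
```

In the real workflow the cost involves the full time-ordered quantum evolution under many noise processes, which is exactly where TensorFlow's graph-based autodiff earns its keep.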

This structure allows us to simply create helper functions which enable physically motivated constraints to be built directly into the graph. For instance, these might be symmetry requirements, limits on how a signal changes in time, or even incorporation of characteristics of the electronics systems used to generate the microwave pulses. Any other capabilities not directly covered by this library of helper functions can also be directly coded as TensorFlow primitives.
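One way such constraints can be enforced by construction - rather than penalized in the cost - is to route unconstrained optimization variables through fixed transformations before they enter the rest of the graph. A hedged NumPy sketch of the idea (the function name and bounds are illustrative, not part of any real library):

```python
import numpy as np

def constrained_pulse(raw, max_amplitude=1.0):
    """Map unconstrained optimization variables to a pulse that is
    amplitude-bounded and time-symmetric by construction."""
    bounded = max_amplitude * np.tanh(raw)            # |amplitude| < max, always
    return np.concatenate([bounded, bounded[::-1]])   # mirror in time: symmetric

# Whatever values the optimizer tries, the resulting pulse obeys both constraints.
pulse = constrained_pulse(np.array([0.3, -2.0, 5.0]))
print(pulse)
```

Because the constraints hold for every value of the raw variables, the optimizer can search freely without ever proposing an unphysical pulse.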

With this approach we achieve an extremely flexible and high-performance optimization engine; our direct benchmarking has revealed order-of-magnitude benefits in time to solution relative to the best available alternative architectures.

The capabilities enabled by this toolkit span the space of tasks required to stabilize quantum computing hardware and reduce errors at the lowest layer of the quantum computing stack. And importantly they're experimentally verified on real quantum computing hardware; quantum firmware has been shown to reduce the likelihood of errors, mitigate system performance variations across devices, stabilize hardware against slowly drifting out of calibration, and even make quantum logic operations more compatible with higher level abstractions in quantum computing such as quantum error correction. All of these capabilities and real hardware demonstrations are accessible via our publicly available User Guides and Application Notes in executable Jupyter notebook form.

Ultimately, we believe that building and operating large-scale quantum computing systems will be effectively impossible without the integration of the capabilities encapsulated in quantum firmware. There are many concepts to be drawn from across the fields of machine learning and robotic control in the drive for performance and autonomy, and TensorFlow has proven an efficient language to support the development of the critical toolsets.

A brief history of QC, from Shor to quantum machine learning

The quantum computing boom started in 1994 with the discovery of Shor's algorithm for factoring large numbers. Public key cryptosystems - which is to say, most encryption - rely on the difficulty of factoring the products of large primes to keep messages safe from prying computers. By virtue of their approach to encoding and processing information, however, quantum computers are conjectured to be able to factor these numbers faster - exponentially faster - than a classical machine. In principle this poses an existential threat not only to national security, but also to emerging technologies such as cryptocurrencies.

This realization set in motion the development of the entire field of quantum computing. Shor's algorithm spurred the NSA to begin one of its first ever open, university-driven research programs asking the question of whether such systems could be built. Fast forward to 2020 and quantum supremacy has been achieved, meaning that a real quantum computing hardware system has performed a task that's effectively impossible for even the world's largest supercomputers.

Quantum supremacy is an important technical milestone whose practical importance in solving problems of relevance to end users remains a bit unclear. Our community is continuing to make great progress towards quantum advantage - a threshold indicating that it's actually cheaper or faster to use a quantum computer for a problem of practical relevance. And for the right problems, we think that within the next 5-10 years we'll cross that threshold with a quantum computer that isn't that much bigger than the ones we have today. It just needs to perform much better.

So, which problems are the right problems for quantum computers to address first?

In many respects, Shor's algorithm has receded in importance as the scale of the challenge emerged. A recent technical analysis suggests that we're unlikely to see Shor deployed at a useful scale until 2039. Today, small-scale machines with a couple of dozen interacting qubits exist in labs around the world, built from superconducting circuits, individual trapped atoms, or similarly exotic materials. The problem is that these early machines are just too small and too fragile to solve problems relevant to factoring.

To factor a number sufficiently large to be relevant in cryptography, one would need a system composed of thousands of qubits capable of handling trillions of operations each. This is nothing for a conventional machine, whose hardware can run at a billion operations per second for a billion years and remain unlikely to suffer a single fault. But as we've seen it's quite a different story for quantum computers.

These limits have driven the emergence of a new class of applications in materials science and chemistry that could prove equally impactful, using much smaller systems. Quantum computing in the near term could also help develop new classes of artificial intelligence systems. Recent efforts have demonstrated a strong and unexpected link between quantum computation and artificial neural networks, potentially portending new approaches to machine learning.

This class of problem can often be cast as optimizations where input into a classical machine learning algorithm comes from a small quantum computation, or where data is represented in the quantum domain and a learning procedure implemented. TensorFlow Quantum provides an exciting toolset for developers seeking new and improved ways to exploit the small quantum computers existing now and in the near future.

Still, even those small machines don't perform particularly well. Q-CTRL's quantum firmware enables users to extract maximum performance from hardware. Thus we see that TensorFlow has a critical role to play across the emerging quantum computing software stack - from quantum firmware through to algorithms for quantum machine learning.

Resources if you'd like to learn more

We appreciate that members of the TensorFlow community may have varying levels of familiarity with quantum computing, and that this overview was only a starting point. To help readers interested in learning more about quantum computing we're happy to provide a few resources:

  • For those knowledgeable about machine learning, Q-CTRL has also produced a series of webinars introducing the concept of Robust Control in quantum computing and even demonstrating reinforcement learning to discover gates on real quantum hardware.
  • If you need to start from zero, Q-CTRL has produced a series of introductory video tutorials helping the uninitiated begin their quantum journey via our learning center. We also offer a visual interface enabling new users to discover and build intuition for the core concepts underlying quantum computing - including the impact of noise on quantum hardware.
  • Jack Hidary from X wrote a great text focused on linking the foundations of quantum computing with how teams today write code for quantum machines.
  • The traditional "formal" starting point for those interested in quantum computing is the timeless textbook from "Mike and Ike" (Nielsen and Chuang's Quantum Computation and Quantum Information).
...


