Quantum

D-Wave Annealing Quantum Computers Tackle Big Data with a Machine-Learning Quantum Boltzmann Machine (Artificial Neural Network)

Every two seconds, sensors measuring the United States’ electrical grid collect 3 petabytes of data – the equivalent of 3 million gigabytes. Data analysis on that scale is a challenge when crucial information is stored in an inaccessible database.

But researchers at Purdue University are working on a solution, combining quantum algorithms with classical computing on small-scale quantum computers to speed up database accessibility. They are using data from the U.S. Department of Energy National Labs’ sensors, called phasor measurement units, which collect information on the electrical power grid, including voltages, currents and power generation. Because these values can vary, keeping the power grid stable requires continuously monitoring the sensors.

Sabre Kais, a professor of chemical physics and principal investigator, will lead the effort to develop new quantum algorithms for processing the extensive data generated by these sensors.

“Non-quantum algorithms that are used to analyze the data can predict the state of the grid, but as more and more phasor measurement units are deployed in the electrical network, we need faster algorithms,” said Alex Pothen, professor of computer science and co-investigator on the project. “Quantum algorithms for data analysis have the potential to speed up the computations substantially in a theoretical sense, but great challenges remain in achieving quantum computers that can process such large amounts of data.”

The research team’s method has potential for a number of practical applications, such as helping industries optimize their supply-chain and logistics management. It could also lead to new chemical and material discovery using an artificial neural network known as a quantum Boltzmann machine. This kind of neural network is used for machine learning and data analysis.

“We have already developed a hybrid quantum algorithm employing a quantum Boltzmann machine to obtain accurate electronic structure calculations,” Kais said. “We have proof of concept showing results for small molecular systems, which will allow us to screen molecules and accelerate the discovery of new materials.”

A paper outlining these results was published Wednesday in the journal Nature Communications.

Machine learning algorithms have been used to calculate the approximate electronic properties of millions of small molecules, but navigating these molecular systems is challenging for chemical physicists. Kais and co-investigator Yong Chen, director of the Purdue Quantum Center and professor of physics and astronomy and of electrical and computer engineering, are confident that their quantum machine learning algorithm could address this.

Their algorithms could also be used for optimizing solar farms. The lifetime of a solar farm varies depending on the climate as solar cells degrade each year from weather, according to Muhammad Alam, professor of electrical and computer engineering and a co-investigator of the project. Using quantum algorithms would make it easier to determine the lifetime of solar farms and other sustainable energy technologies for a given geographical location and could help make solar technologies more efficient.

Additionally, the team hopes to launch an externally funded industry-university cooperative research center (IUCRC) to promote further research in quantum machine learning for data analytics and optimization. Benefits of an IUCRC include leveraging academic-corporate partnerships, expanding materials science research, and responding to market incentives. Further research in quantum machine learning for data analysis is necessary before it can be of use to industries for practical application, Chen said, and an IUCRC would make tangible progress.

“We are close to developing the classical algorithms for this data analysis, and we expect them to be widely used,” Pothen said. “Quantum algorithms are high-risk, high-reward research, and it is difficult to predict in what time frame these algorithms will find practical use.”

The team’s research project was one of eight selected by Purdue’s Integrative Data Science Initiative to be funded for a two-year period. The initiative will encourage interdisciplinary collaboration, build on Purdue’s strengths to position the university as a leader in data science research, and focus on one of four areas:

  • health care
  • defense
  • ethics, society and policy
  • fundamentals, methods, and algorithms.

The research thrusts of the Integrative Data Science Initiative are hosted by Purdue’s Discovery Park.

“This is an exciting time to combine machine learning with quantum computing,” Kais said. “Impressive progress has been made recently in building quantum computers, and quantum machine learning techniques will become powerful tools for finding new patterns in big data.”

from: https://phys.org/news/2018-10-quantum-tackle-big-machine.html

Quantum machine learning

Quantum machine learning is an emerging interdisciplinary research area at the intersection of quantum physics and machine learning. The most common use of the term refers to machine learning algorithms for the analysis of classical data executed on a quantum computer. This includes hybrid methods that involve both classical and quantum processing, where computationally expensive subroutines are outsourced to a quantum device. Furthermore, quantum algorithms can be used to analyze quantum states instead of classical data. Beyond quantum computing, the term “quantum machine learning” is often associated with machine learning methods applied to data generated from quantum experiments, such as learning quantum phase transitions or creating new quantum experiments. Quantum machine learning also extends to a branch of research that explores methodological and structural similarities between certain physical systems and learning systems, in particular neural networks. For example, some mathematical and numerical techniques from quantum physics carry over to classical deep learning and vice versa. Finally, researchers investigate more abstract notions of learning theory with respect to quantum information, sometimes referred to as “quantum learning theory”.

Quantum sampling techniques

Sampling from high-dimensional probability distributions is at the core of a wide spectrum of computational techniques with important applications across science, engineering, and society. Examples include deep learning, probabilistic programming, and other machine learning and artificial intelligence applications.

A computationally hard problem, which is key for some relevant machine learning tasks, is the estimation of averages over probabilistic models defined in terms of a Boltzmann distribution. Sampling from generic probabilistic models is hard: algorithms relying heavily on sampling are expected to remain intractable no matter how large and powerful classical computing resources become. Even though quantum annealers, like those produced by D-Wave Systems, were designed for challenging combinatorial optimization problems, they have recently been recognized as potential candidates to speed up computations that rely on sampling by exploiting quantum effects.
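As a classical point of reference, the sketch below shows how such an average is typically estimated with Markov chain Monte Carlo. It uses a small Ising-type energy function with made-up couplings, and single-spin-flip Metropolis sampling stands in for the more sophisticated samplers used in practice; all names and parameters here are illustrative.

```python
import numpy as np

# Estimate an average <f(s)> over a Boltzmann distribution p(s) ~ exp(-beta * E(s))
# for a small Ising-type energy function, using single-spin-flip Metropolis sampling.
# Couplings and step counts are illustrative.

rng = np.random.default_rng(0)

n = 8
J = np.triu(rng.normal(scale=0.5, size=(n, n)), k=1)   # random couplings, upper triangle

def energy(s):
    """Ising-type energy E(s) = -sum_{i<j} J_ij s_i s_j for spins s_i in {-1, +1}."""
    return -s @ J @ s

def metropolis_average(f, beta=1.0, n_steps=50_000):
    """Estimate <f(s)> under the Boltzmann distribution with Metropolis sampling."""
    s = rng.choice([-1, 1], size=n)
    total = 0.0
    for _ in range(n_steps):
        i = rng.integers(n)
        proposal = s.copy()
        proposal[i] *= -1
        dE = energy(proposal) - energy(s)
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            s = proposal
        total += f(s)
    return total / n_steps

# Example: average magnetization per spin
print(metropolis_average(lambda s: s.mean()))
```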

Some research groups have recently explored the use of quantum annealing hardware for training Boltzmann machines and deep neural networks. The standard approach to training Boltzmann machines relies on the computation of certain averages that can be estimated by standard sampling techniques, such as Markov chain Monte Carlo algorithms. Another possibility is to rely on a physical process, like quantum annealing, that naturally generates samples from a Boltzmann distribution. The objective is to find the optimal control parameters that best represent the empirical distribution of a given dataset.
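For concreteness, here is a minimal contrastive-divergence (CD-1) update for a binary restricted Boltzmann machine, written in plain NumPy. The single Gibbs step in the negative phase is the sampling stage that a quantum annealer could, in principle, replace; layer sizes, data, and the learning rate are illustrative.

```python
import numpy as np

# Minimal CD-1 training step for a binary restricted Boltzmann machine.
rng = np.random.default_rng(1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(v0, W, b, c, lr=0.05):
    """One CD-1 step. v0: batch of visible vectors; W: weights; b, c: visible/hidden biases."""
    # Positive phase: hidden probabilities and samples given the data
    ph0 = sigmoid(v0 @ W + c)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    # Negative phase: one Gibbs step back to visibles, then hiddens again
    pv1 = sigmoid(h0 @ W.T + b)
    v1 = (rng.random(pv1.shape) < pv1).astype(float)
    ph1 = sigmoid(v1 @ W + c)
    # Approximate gradient: data correlations minus model correlations
    W += lr * (v0.T @ ph0 - v1.T @ ph1) / len(v0)
    b += lr * (v0 - v1).mean(axis=0)
    c += lr * (ph0 - ph1).mean(axis=0)
    return W, b, c

# Toy usage on random binary data (illustrative only)
n_vis, n_hid = 6, 4
W = 0.01 * rng.normal(size=(n_vis, n_hid))
b, c = np.zeros(n_vis), np.zeros(n_hid)
data = (rng.random((32, n_vis)) < 0.5).astype(float)
for _ in range(200):
    W, b, c = cd1_update(data, W, b, c)
```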

The D-Wave 2X system hosted at NASA Ames Research Center has been recently used for the learning of a special class of restricted Boltzmann machines that can serve as a building block for deep learning architectures. Complementary work that appeared roughly simultaneously showed that quantum annealing can be used for supervised learning in classification tasks. The same device was later used to train a fully connected Boltzmann machine to generate, reconstruct, and classify down-scaled, low-resolution handwritten digits, among other synthetic datasets. In both cases, the models trained by quantum annealing had a similar or better performance in terms of quality. The ultimate question that drives this endeavour is whether there is quantum speedup in sampling applications. Experience with the use of quantum annealers for combinatorial optimization suggests the answer is not straightforward.

Inspired by the success of Boltzmann machines based on the classical Boltzmann distribution, a new machine learning approach based on the quantum Boltzmann distribution of a transverse-field Ising Hamiltonian was recently proposed. Due to the non-commutative nature of quantum mechanics, the training process of the quantum Boltzmann machine can become nontrivial. This problem was, to some extent, circumvented by introducing bounds on the quantum probabilities, allowing the authors to train the model efficiently by sampling. It is possible to train a specific type of quantum Boltzmann machine on the D-Wave 2X by using a learning rule analogous to that of classical Boltzmann machines.
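For a system small enough to diagonalize exactly, the quantum Boltzmann distribution described above can be written down directly as $\rho = e^{-H}/\mathrm{Tr}\, e^{-H}$ for a transverse-field Ising Hamiltonian, with the probabilities of classical (computational-basis) configurations given by the diagonal of $\rho$. The sketch below uses three qubits and illustrative fields and couplings.

```python
import numpy as np
from functools import reduce

# Exact-diagonalization sketch of the quantum Boltzmann distribution
# rho = exp(-H) / Tr exp(-H) for a small transverse-field Ising Hamiltonian
# H = -Gamma * sum_a sx_a - sum_a b_a sz_a - sum_{a<b} w_ab sz_a sz_b.
# Fields and couplings below are illustrative.

sx = np.array([[0.0, 1.0], [1.0, 0.0]])
sz = np.array([[1.0, 0.0], [0.0, -1.0]])
I2 = np.eye(2)

def embed(op, site, n):
    """Place a single-qubit operator on `site` of an n-qubit register."""
    return reduce(np.kron, [op if k == site else I2 for k in range(n)])

n = 3
Gamma = 1.0                                # transverse field strength
bias = np.array([0.2, -0.1, 0.3])          # longitudinal biases (illustrative)
coupling = {(0, 1): 0.5, (1, 2): -0.4}     # z-z couplings (illustrative)

H = -Gamma * sum(embed(sx, a, n) for a in range(n))
H = H - sum(bias[a] * embed(sz, a, n) for a in range(n))
for (a, b), w in coupling.items():
    H = H - w * embed(sz, a, n) @ embed(sz, b, n)

# rho = exp(-H)/Z via the eigendecomposition of the (real symmetric) Hamiltonian
evals, evecs = np.linalg.eigh(H)
rho = evecs @ np.diag(np.exp(-evals)) @ evecs.T
rho /= np.trace(rho)

# Probabilities of computational-basis (classical) configurations: diagonal of rho
probs = np.diag(rho)
print(probs, probs.sum())                  # probabilities sum to 1
```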

Quantum annealing is not the only technology for sampling. In a prepare-and-measure scenario, a universal quantum computer prepares a thermal state, which is then sampled by measurements. This can reduce the time required to train a deep restricted Boltzmann machine, and provide a richer and more comprehensive framework for deep learning than classical computing. The same quantum methods also permit efficient training of full Boltzmann machines and multi-layer, fully connected models and do not have well-known classical counterparts. Relying on an efficient thermal state preparation protocol starting from an arbitrary state, quantum-enhanced Markov logic networks exploit the symmetries and the locality structure of the probabilistic graphical model generated by a first-order logic template. This provides an exponential reduction in computational complexity in probabilistic inference, and, while the protocol relies on a universal quantum computer, under mild assumptions it can be embedded on contemporary quantum annealing hardware.

Quantum neural networks

Quantum analogues or generalizations of classical neural nets are often referred to as quantum neural networks. The term is claimed by a wide range of approaches, including the implementation and extension of neural networks using photons, layered variational circuits or quantum Ising-type models.
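As one concrete reading of “layered variational circuits”, the two-qubit sketch below alternates parameterized single-qubit rotations with an entangling CNOT and reads out the expectation value of $Z$ on the first qubit. It is a plain state-vector simulation not tied to any particular framework, and all angles are illustrative.

```python
import numpy as np

# Minimal two-qubit layered variational circuit: per layer, one RY rotation on each
# qubit followed by a CNOT entangler; the output is <Z> on the first qubit.

I2 = np.eye(2)
Z = np.diag([1.0, -1.0])
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

def ry(theta):
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def circuit(params):
    """params has shape (n_layers, 2): one RY angle per qubit per layer."""
    state = np.array([1.0, 0.0, 0.0, 0.0])        # start in |00>
    for t0, t1 in params:
        state = np.kron(ry(t0), ry(t1)) @ state   # independent single-qubit rotations
        state = CNOT @ state                      # entangling gate
    return state

def expect_z_first(state):
    return state @ np.kron(Z, I2) @ state         # <psi| Z (x) I |psi>

params = np.array([[0.3, 1.1], [0.7, -0.4]])      # two layers of illustrative angles
print(expect_z_first(circuit(params)))
```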

Boltzmann Distribution

In statistical mechanics and mathematics, a Boltzmann distribution (also called Gibbs distribution) is a probability distribution, probability measure, or frequency distribution of particles in a system over various possible states. The distribution is expressed in the form

$$F(\mathrm{state}) \propto e^{-E/kT}$$

where $E$ is the state energy (which varies from state to state), and $kT$ (a constant of the distribution) is the product of the Boltzmann constant and the thermodynamic temperature.

In statistical mechanics, the Boltzmann distribution is a probability distribution that gives the probability that a system will be in a certain state as a function of that state’s energy and the temperature of the system. It is given as

$$p_i = \frac{e^{-\varepsilon_i/kT}}{\sum_{j=1}^{M} e^{-\varepsilon_j/kT}}$$

where $p_i$ is the probability of state $i$, $\varepsilon_i$ the energy of state $i$, $k$ the Boltzmann constant, $T$ the temperature of the system, and $M$ the number of states accessible to the system. The sum is over all states accessible to the system of interest. The term “system” here has a very wide meaning; it can range from a single atom to a macroscopic system such as a natural gas storage tank. Because of this, the Boltzmann distribution can be used to solve a very wide variety of problems. The distribution shows that states with lower energy will always have a higher probability of being occupied than states with higher energy.
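A short numerical illustration of this formula, with illustrative state energies expressed in units of $kT$:

```python
import numpy as np

def boltzmann_probabilities(energies, kT=1.0):
    """p_i = exp(-eps_i / kT) / sum_j exp(-eps_j / kT), computed in a numerically stable way."""
    x = -np.asarray(energies, dtype=float) / kT
    x -= x.max()                      # shift for stability; cancels in the ratio
    w = np.exp(x)
    return w / w.sum()

eps = [0.0, 0.5, 1.0, 2.0]            # illustrative state energies
p = boltzmann_probabilities(eps)
print(p)                              # lower-energy states receive higher probability
print(p.sum())                        # probabilities sum to 1
```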


Reinforcement Learning Using Quantum Boltzmann Machines

By Daniel Crawford, Anna Levit, Navid Ghadermarzy, Jaspreet S. Oberoi, & Pooya Ronagh

We investigate whether quantum annealers with select chip layouts can outperform classical computers in reinforcement learning tasks. We associate a transverse-field Ising spin Hamiltonian with a layout of qubits similar to that of a deep Boltzmann machine (DBM) and use simulated quantum annealing (SQA) to numerically simulate quantum sampling from this system. We design a reinforcement learning algorithm in which the set of visible nodes representing the states and actions of an optimal policy are the first and last layers of the deep network. In the absence of a transverse field, our simulations show that DBMs train more effectively than restricted Boltzmann machines (RBMs) with the same number of weights. Since sampling from Boltzmann distributions of a DBM is not classically feasible, this is evidence of the advantage of a non-Turing sampling oracle. We then develop a framework for training the network as a quantum Boltzmann machine (QBM) in the presence of a significant transverse field for reinforcement learning. This further improves the reinforcement learning method using DBMs.
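The classical starting point behind this abstract is free-energy-based reinforcement learning with a restricted Boltzmann machine, in which the negative free energy of a clamped state-action configuration approximates the action value Q(s, a); the paper generalizes this idea to deep and quantum Boltzmann machines sampled by simulated quantum annealing. The sketch below shows only the classical RBM version, with all sizes, the transition, and the learning rate purely illustrative.

```python
import numpy as np

# Free-energy-based RL sketch: states and actions are one-hot visible vectors, and
# Q(s, a) is approximated by the negative free energy of the clamped visibles of an RBM.

rng = np.random.default_rng(2)
n_state, n_action, n_hidden = 4, 2, 8
W = 0.01 * rng.normal(size=(n_state + n_action, n_hidden))

def one_hot(i, n):
    e = np.zeros(n)
    e[i] = 1.0
    return e

def neg_free_energy(s, a, W):
    """Q(s, a) ~ -F(s, a) = sum_j log(1 + exp(v @ W[:, j])) for clamped visibles v = [s, a]."""
    v = np.concatenate([s, a])
    return np.sum(np.logaddexp(0.0, v @ W))

def td_update(s, a, r, s_next, W, gamma=0.95, lr=0.01):
    """One Q-learning-style temporal-difference step on the RBM weights."""
    q = neg_free_energy(s, a, W)
    q_next = max(neg_free_energy(s_next, one_hot(b, n_action), W) for b in range(n_action))
    td_error = r + gamma * q_next - q
    v = np.concatenate([s, a])
    p_h = 1.0 / (1.0 + np.exp(-(v @ W)))      # gradient of -F w.r.t. W is outer(v, p_h)
    return W + lr * td_error * np.outer(v, p_h)

# Illustrative single transition: take action 1 in state 0, receive reward 1, land in state 2
s, a = one_hot(0, n_state), one_hot(1, n_action)
s_next, reward = one_hot(2, n_state), 1.0
W = td_update(s, a, reward, s_next, W)
```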


Video:

Quantum Boltzmann Machine using a Quantum Annealer

Recording Details

PIRSA Number: 16080009

Abstract

Machine learning is a rapidly growing field in computer science with applications in computer vision, voice recognition, medical diagnosis, spam filtering, search engines, etc. In this presentation, I will introduce a new machine learning approach based on quantum Boltzmann distribution of a transverse-field Ising Model. Due to the non-commutative nature of quantum mechanics, the training process of the Quantum Boltzmann Machine (QBM) can become nontrivial.  I will show how to circumvent this problem by introducing bounds on the quantum probabilities. This allows training the QBM efficiently by sampling. I will then show examples of QBM training with and without the bound, using exact diagonalization, and compare the results with classical Boltzmann training. Finally, after a brief introduction to D-Wave quantum annealing processors, I will discuss the possibility of using such processors for QBM training and application.

see the video (40 mins) of the presentation:
https://perimeterinstitute.ca/videos/quantum-boltzmann-machine-using-quantum-annealer
