Archive for the ‘supercomputing’ category: Page 59

Aug 4, 2020

Calculating the benefits of exascale and quantum computers

Posted by in categories: information science, quantum physics, supercomputing

A quintillion calculations a second. That’s a 1 with 18 zeros after it. It’s the speed at which an exascale supercomputer will process information. The Department of Energy (DOE) is preparing for the first exascale computer to be deployed in 2021. Two more will follow soon after. Yet quantum computers may be able to complete more complex calculations even faster than these up-and-coming exascale computers. But these technologies complement each other much more than they compete.
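To get a rough sense of that scale, a back-of-the-envelope comparison helps (the laptop figures below are illustrative assumptions, not from the article):

```python
# How much ordinary-computer work fits into one exascale-second?
exa_ops_per_sec = 10**18  # one quintillion operations per second

# Assume a 3 GHz laptop core retiring ~4 operations per cycle
# (an illustrative figure, not a measured one).
laptop_ops_per_sec = 3e9 * 4

seconds_per_year = 365 * 24 * 3600
laptop_years = exa_ops_per_sec / (laptop_ops_per_sec * seconds_per_year)
print(f"~{laptop_years:.0f} laptop-years of work per exascale-second")
```

Under those assumptions, a single second of exascale computing corresponds to several years of nonstop work on a laptop core.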

It’s going to be a while before quantum computers are ready to tackle major scientific research questions. While quantum researchers and scientists in other areas are collaborating to design quantum computers to be as effective as possible once they’re ready, that’s still a long way off. Scientists are figuring out how to build qubits for quantum computers, the very foundation of the technology. They’re establishing the most fundamental quantum algorithms that they need to do simple calculations. The hardware and algorithms need to be far enough along for coders to develop operating systems and software to do scientific research. Currently, we’re at the same point that scientists in the 1950s were at with computers that ran on vacuum tubes. Most of us regularly carry computers in our pockets now, but it took decades to get to this level of accessibility.

In contrast, exascale computers will be ready next year. When they launch, they’ll already be five times faster than our fastest current supercomputer, Summit, at Oak Ridge National Laboratory’s Leadership Computing Facility, a DOE Office of Science user facility. Right away, they’ll be able to tackle major challenges in modeling Earth systems, analyzing genes, tracking barriers to fusion, and more. These powerful machines will allow scientists to include more variables in their equations and improve models’ accuracy. As long as we can find new ways to improve conventional computers, we’ll do it.

Aug 1, 2020

D-Wave’s Path to 5000 Qubits; Google’s Quantum Supremacy Claim

Posted by in categories: quantum physics, supercomputing

On the heels of IBM’s quantum news last week come two more quantum items. D-Wave Systems today announced the name of its forthcoming 5000-qubit system, Advantage (yes, the name choice isn’t serendipity), at its user conference being held this week in Newport, RI. Last week a Google draft paper, discovered by the Financial Times, claimed attaining Quantum Supremacy using a 53-qubit superconducting processor. The paper, found on NASA’s website, was later withdrawn. Conversation around it has been bubbling in the QC community since.

More on D-Wave’s announcements later – the Advantage system isn’t expected to be broadly available until mid-2020 which is roughly in keeping with its stated plans. The Google work on quantum supremacy is fascinating. Google has declined to comment on the paper. How FT became aware of the paper isn’t clear. A few observers suggest it looks like an early draft.

Quantum supremacy, of course, is the notion of a quantum computer doing something that classical computers simply can’t reasonably do. In this instance, the reported Google paper claimed it was able to perform a task (a particular random number generation) on its QC in 200 seconds versus what would take on the order of 10,000 years on a supercomputer. In an archived copy of the draft that HPCwire was able to find, the authors say they “estimated the classical computational cost” of running supremacy circuits on Summit and on a large Google cluster. (For an excellent discussion of quantum supremacy, see Scott Aaronson’s (University of Texas) blog post from yesterday, Scott’s Supreme Quantum Supremacy FAQ.)
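The size of the claimed gap can be checked with simple arithmetic on the two figures quoted above:

```python
# Claimed quantum supremacy gap: 200 seconds on the quantum processor
# versus roughly 10,000 years on a classical supercomputer.
quantum_seconds = 200
classical_years = 10_000
classical_seconds = classical_years * 365 * 24 * 3600

speedup = classical_seconds / quantum_seconds
print(f"Claimed speedup factor: ~{speedup:.1e}")
```

That works out to a claimed speedup of over a billion, which is why the result drew so much attention even as its classical-cost estimate was being debated.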

Aug 1, 2020

Quantum machines learn ‘quantum data’

Posted by in categories: information science, quantum physics, robotics/AI, supercomputing

Skoltech scientists have shown that quantum enhanced machine learning can be used on quantum (as opposed to classical) data, overcoming a significant slowdown common to these applications and opening a “fertile ground to develop computational insights into quantum systems.” The paper was published in the journal Physical Review A.

Quantum computers utilize quantum mechanical effects to store and manipulate information. While quantum effects are often described as counterintuitive, they could enable quantum enhanced calculations that dramatically outperform the best supercomputers. In 2019, Google demonstrated a prototype of this capability, announcing it as quantum computational superiority.

Quantum algorithms have been developed to enhance a range of different computational tasks; more recently this has grown to include quantum enhanced machine learning. Quantum machine learning was partly pioneered by Skoltech’s Laboratory for Quantum Information Processing, led by Jacob Biamonte, a coauthor of this paper. “Machine learning techniques have become powerful tools for finding patterns in data. Quantum systems produce atypical patterns that classical systems are thought not to produce efficiently, so it is not surprising that quantum computers might outperform classical computers on machine learning tasks,” he says.

Jul 29, 2020

Google wins MLPerf benchmark contest with fastest ML training supercomputer

Posted by in categories: robotics/AI, supercomputing

Fast training of machine learning (ML) models is critical for research and engineering teams that deliver new products, services, and research breakthroughs that were previously out of reach. Here at Google, recent ML-enabled advances have included more helpful search results and a single ML model that can translate 100 different languages.

The latest results from the industry-standard MLPerf benchmark competition demonstrate that Google has built the world’s fastest ML training supercomputer. Using this supercomputer, as well as our latest Tensor Processing Unit (TPU) chip, Google set performance records in six out of eight MLPerf benchmarks.

Jul 29, 2020

Solving materials problems with a quantum computer

Posted by in categories: chemistry, engineering, information science, particle physics, quantum physics, supercomputing

Quantum computers have enormous potential for calculations using novel algorithms and involving amounts of data far beyond the capacity of today’s supercomputers. While such computers have been built, they are still in their infancy and have limited applicability for solving complex problems in materials science and chemistry. For example, they only permit the simulation of the properties of a few atoms for materials research.

Scientists at the U.S. Department of Energy’s (DOE) Argonne National Laboratory and the University of Chicago (UChicago) have developed a method paving the way to using quantum computers to simulate realistic molecules and complex materials, whose description requires hundreds of atoms.

The research team is led by Giulia Galli, director of the Midwest Integrated Center for Computational Materials (MICCoM), a group leader in Argonne’s Materials Science division and a member of the Center for Molecular Engineering at Argonne. Galli is also the Liew Family Professor of Electronic Structure and Simulations in the Pritzker School of Molecular Engineering and a Professor of Chemistry at UChicago. She worked on this project with assistant scientist Marco Govoni and graduate student He Ma, both part of Argonne’s Materials Science division and UChicago.

Jul 27, 2020

700-petaflop AI supercomputer planned for 2021

Posted by in categories: robotics/AI, supercomputing

As the world edges closer towards exascale computing, the University of Florida has announced a partnership with chipmaker NVIDIA that aims to create a 700-petaflop AI supercomputer next year.

Jul 26, 2020

New Argonne supercomputer, built for next-gen AI, will be most powerful in U.S.

Posted by in categories: neuroscience, robotics/AI, supercomputing

“‘Aurora will enable us to explore new frontiers in artificial intelligence and machine learning,’ said Narayanan ‘Bobby’ Kasthuri, assistant professor of neurobiology at the University of Chicago and researcher at Argonne. ‘This will be the first time scientists have had a machine powerful enough to match the kind of computations the brain can do.’”

Supercomputer Aurora will help map the human brain at “quintillion—or one billion billion—calculations per second, 50 times quicker than today’s most powerful supercomputers.”

Note: the article discusses implications beyond neuroscience.


Jul 17, 2020

New learning algorithm should significantly expand the possible applications of AI

Posted by in categories: information science, robotics/AI, supercomputing

The high energy consumption of artificial neural networks’ learning activities is one of the biggest hurdles for the broad use of Artificial Intelligence (AI), especially in mobile applications. One approach to solving this problem can be gleaned from knowledge about the human brain.

Although the brain has the computing power of a supercomputer, it needs only 20 watts, which is only a millionth of the energy consumed by a supercomputer.

One of the reasons for this is the efficient transfer of information between neurons in the brain. Neurons send short electrical impulses (spikes) to other neurons—but, to save energy, only as often as absolutely necessary.
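The spike-only-when-necessary behavior can be sketched with a minimal leaky integrate-and-fire neuron (all parameters here are illustrative toy values, not taken from the paper):

```python
# Minimal leaky integrate-and-fire neuron: it accumulates input,
# leaks charge over time, and emits a spike only when its membrane
# potential crosses a threshold -- so weak inputs cost almost nothing.
def simulate_lif(inputs, leak=0.9, threshold=1.0):
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = potential * leak + current  # leaky integration
        if potential >= threshold:
            spikes.append(1)   # fire a spike...
            potential = 0.0    # ...and reset the potential
        else:
            spikes.append(0)
    return spikes

# A weak input never crosses threshold; a stronger one fires regularly.
print(simulate_lif([0.1] * 10))
print(simulate_lif([0.6] * 10))
```

With the weak input the leak cancels the accumulation before threshold is ever reached, so the neuron stays silent and, in hardware terms, consumes essentially no communication energy.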

Jul 16, 2020

Supercomputer reveals atmospheric impact of gigantic planetary collisions

Posted by in categories: space, supercomputing

The giant impacts that dominate late stages of planet formation have a wide range of consequences for young planets and their atmospheres, according to new research.

Research led by Durham University and involving the University of Glasgow, both UK, has developed a way of revealing the scale of atmosphere loss during planetary collisions based on 3D supercomputer simulations.

The simulations show how Earth-like planets with thin atmospheres might have evolved, depending on how they are impacted by other objects.

Jul 9, 2020

The biggest flipping challenge in quantum computing

Posted by in categories: quantum physics, supercomputing

Such noise nearly drowned out the signal in Google’s quantum supremacy experiment. Researchers began by setting the 53 qubits to encode all possible outputs, which ranged from zero to 2⁵³. They implemented a set of randomly chosen interactions among the qubits that in repeated trials made some outputs more likely than others. Given the complexity of the interactions, a supercomputer would need thousands of years to calculate the pattern of outputs, the researchers said. So by measuring it, the quantum computer did something that no ordinary computer could match. But the pattern was barely distinguishable from the random flipping of qubits caused by noise. “Their demonstration is 99% noise and only 1% signal,” Kuperberg says.
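With 53 qubits, the size of the output space that had to be encoded is easy to compute:

```python
# Number of distinct bitstrings a 53-qubit measurement can produce.
n_qubits = 53
n_outcomes = 2 ** n_qubits
print(f"{n_outcomes:,} possible bitstrings")  # about nine quadrillion
```

Repeated runs of the experiment sample only a tiny fraction of that space, which is part of why detecting the 1% signal against the noise required careful statistical analysis.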

To realize their ultimate dreams, developers want qubits that are as reliable as the bits in an ordinary computer. “You want to have a qubit that stays coherent until you switch off the machine,” Neven says.

Scientists’ approach of spreading the information of one qubit—a “logical qubit”—among many physical ones traces its roots to the early days of ordinary computers in the 1950s. The bits of early computers consisted of vacuum tubes or mechanical relays, which were prone to flip unexpectedly. To overcome the problem, famed mathematician John von Neumann pioneered the field of error correction.
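Von Neumann’s classical idea survives today as the repetition code with majority voting. A toy sketch (illustrative parameters, and a classical stand-in for the quantum codes discussed here, not their actual mechanism):

```python
import random

# Encode one logical bit as three physical copies, flip each copy
# independently with probability p, then decode by majority vote.
def noisy_copy(bit, p, rng):
    return bit ^ (rng.random() < p)

def decode_majority(copies):
    return int(sum(copies) >= 2)

def logical_error_rate(p, trials=100_000, seed=0):
    rng = random.Random(seed)
    errors = 0
    for _ in range(trials):
        copies = [noisy_copy(0, p, rng) for _ in range(3)]
        errors += decode_majority(copies) != 0
    return errors / trials

# Majority voting only fails when two or more copies flip, so a
# physical error rate p becomes roughly 3p^2 (for small p).
p = 0.05
print(f"physical: {p}, logical: {logical_error_rate(p):.4f}")
```

A 5% physical error rate drops below 1% after decoding; quantum error correction pursues the same suppression, though spreading a qubit’s state across many physical qubits is far harder than copying a bit.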
