BLOG

Archive for the ‘supercomputing’ category: Page 51

Oct 23, 2019

Google officially lays claim to quantum supremacy

Posted by in categories: quantum physics, supercomputing

The quantum computer Sycamore reportedly performed a calculation that even the most powerful supercomputers available can’t reproduce.

Oct 21, 2019

New supercomputer simulations explore magnetic reconnection and make a surprising discovery

Posted by in categories: cosmology, mobile phones, supercomputing

Magnetic reconnection, a process in which magnetic field lines tear and come back together, releasing large amounts of kinetic energy, occurs throughout the universe. The process gives rise to auroras, solar flares and geomagnetic storms that can disrupt cell phone service and electric grids on Earth. A major challenge in the study of magnetic reconnection, however, is bridging the gap between these large-scale astrophysical scenarios and small-scale experiments that can be done in a lab.

Researchers have now overcome this barrier through a combination of clever experiments and cutting-edge simulations. In doing so, they have uncovered a previously unknown role for a universal process called the “Biermann battery effect,” which turns out to impact magnetic reconnection in unexpected ways.

The Biermann battery effect, a possible seed for the magnetic fields pervading our universe, generates an electric current that produces these fields. The surprise findings, made through these simulations, show the effect can play a significant role in the reconnection occurring when the Earth’s magnetosphere interacts with astrophysical plasmas. The effect first generates magnetic field lines, but then reverses roles and cuts them like scissors slicing a rubber band. The sliced field lines then reconnect away from the original reconnection point.
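For readers who want the underlying expression, a standard textbook form of the Biermann battery source term in the magnetic induction equation is sketched below (this is not quoted from the study; the prefactor and sign depend on the unit convention used):

```latex
% Induction equation with the Biermann battery source term (Gaussian units);
% v is the plasma velocity, n_e and T_e the electron density and temperature.
\frac{\partial \mathbf{B}}{\partial t}
  = \nabla \times \left( \mathbf{v} \times \mathbf{B} \right)
  + \frac{c\, k_B}{e\, n_e}\, \nabla T_e \times \nabla n_e
```

The battery term is nonzero only where the electron temperature and density gradients are misaligned, which is why it can create magnetic field lines in a plasma that initially has none.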

Oct 17, 2019

Why Computers Will Never Be Truly Conscious

Posted by in category: supercomputing

Attempts to build supercomputer brains have not even come close to the real thing.

Oct 14, 2019

New approach for the simulation of quantum chemistry—modelling the molecular architecture

Posted by in categories: chemistry, particle physics, quantum physics, supercomputing

Searching for new substances and developing new techniques in the chemical industry: tasks that are often accelerated using computer simulations of molecules or reactions. But even supercomputers quickly reach their limits. Now researchers at the Max Planck Institute of Quantum Optics in Garching (MPQ) have developed an alternative, analogue approach. An international team around Javier Argüello-Luengo, Ph.D. candidate at the Institute of Photonic Sciences (ICFO), Ignacio Cirac, Director and Head of the Theory Department at the MPQ, Peter Zoller, Director at the Institute of Quantum Optics and Quantum Information in Innsbruck (IQOQI), and others has designed the first blueprint for a quantum simulator that mimics the quantum chemistry of molecules. Just as an architectural model can be used to test the statics of a future building, a molecule simulator can help investigate the properties of molecules. The results are now published in the scientific journal Nature.

Using hydrogen, the simplest of all molecules, as an example, the global team of physicists from Garching, Barcelona, Madrid, Beijing and Innsbruck theoretically demonstrates that the quantum simulator can reproduce the behaviour of a real molecule’s electron shell. In their work, they also show how experimental physicists can build such a simulator step by step. “Our results offer a new approach to the investigation of phenomena appearing in quantum chemistry,” says Javier Argüello-Luengo. This is highly interesting for chemists because classical computers notoriously struggle to simulate chemical compounds, as molecules obey the laws of quantum physics. An electron in its shell, for example, can rotate to the left and to the right simultaneously. In a compound of many particles, such as a molecule, the number of these parallel possibilities multiplies. Because every electron interacts with every other, the complexity quickly becomes impossible to handle.
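As a rough illustration of how these parallel possibilities multiply, the sketch below (my own back-of-the-envelope example, not taken from the paper) estimates the memory a classical computer would need just to store the full quantum state of n two-level particles:

```python
# Illustrative only: memory needed on a classical computer to store the full
# quantum state of n two-level particles (each amplitude: 16 bytes, complex128).
for n in (10, 30, 50, 80):
    amplitudes = 2 ** n
    terabytes = amplitudes * 16 / 1e12
    print(f"n = {n:2d}: {amplitudes:.2e} amplitudes, {terabytes:.2e} TB")
```

By n = 50 the amplitudes alone run to petabytes, which is the kind of wall the analogue simulator approach is meant to sidestep.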

As a way out, in 1982, the American physicist Richard Feynman suggested the following: we should simulate quantum systems by reconstructing them as simplified models in the laboratory from building blocks such as atoms, which are inherently quantum and therefore carry this parallelism of possibilities by default. Today, quantum simulators are already in use, for example to imitate crystals. Crystals have a regular, three-dimensional atomic lattice, which is imitated by several intersecting laser beams, the “optical lattice.” The intersection points form something like the wells in an egg carton into which the atoms are filled. The interaction between the atoms can be controlled by amplifying or attenuating the laser beams. This way researchers gain a variable model in which they can study atomic behavior very precisely.
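To make the “egg carton” picture concrete, here is a minimal sketch of the one-dimensional optical-lattice potential formed by a standing laser wave; the wavelength and depth values are arbitrary illustrative choices, not parameters from the proposal.

```python
import numpy as np

# Minimal sketch of a 1-D optical-lattice potential V(x) = V0 * sin^2(k x),
# the standing wave formed by two counter-propagating laser beams.
# All parameter values are arbitrary illustrative choices.
wavelength = 1.0                 # laser wavelength (arbitrary units)
k = 2 * np.pi / wavelength
V0 = 5.0                         # lattice depth, set by the laser intensity

x = np.linspace(0.0, 3.0 * wavelength, 601)
V = V0 * np.sin(k * x) ** 2

# The potential minima, spaced half a wavelength apart, are the "wells of
# the egg carton" into which the atoms are loaded.
wells = x[np.isclose(V, 0.0, atol=1e-9)]
print("well positions:", np.round(wells, 3))
```

Strengthening the beams deepens the wells and pins the atoms in place; weakening them lets the atoms tunnel and interact more, which is exactly the tunability the article describes.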

Sep 27, 2019

DARPA aims to make networks 100 times speedier with FastNIC

Posted by in categories: internet, supercomputing

Having a slow connection is always frustrating, but just imagine how supercomputers feel. All those cores doing all kinds of processing at lightning speed, but in the end they’re all waiting on an outdated network interface to stay in sync. DARPA doesn’t like it. So DARPA wants to change it — specifically by making a new network interface a hundred times faster.

The problem is this. As DARPA estimates it, processors and memory on a computer or server can in a general sense work at a speed of roughly 10^14 bits per second — that’s comfortably into the terabit region — and networking hardware like switches and fiber is capable of about the same.

“The true bottleneck for processor throughput is the network interface used to connect a machine to an external network, such as an Ethernet, therefore severely limiting a processor’s data ingest capability,” explained DARPA’s Jonathan Smith in a news post by the agency about the project. (Emphasis mine.)
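Putting rough numbers on that bottleneck (an illustrative calculation; the 10^14 bit/s figure is DARPA’s estimate quoted above, while the 100 Gb/s baseline network interface is an assumed, typical present-day value):

```python
# Back-of-the-envelope look at the network-interface bottleneck.
# The ~1e14 bit/s processor/memory figure is DARPA's estimate from the article;
# the 100 Gb/s baseline network interface is an assumed present-day value.
processor_bits_per_s = 1e14
baseline_nic_bits_per_s = 100e9

gap = processor_bits_per_s / baseline_nic_bits_per_s
fastnic_target = 100 * baseline_nic_bits_per_s   # "a hundred times faster"

print(f"processor vs. NIC gap today: ~{gap:,.0f}x")
print(f"FastNIC target:              {fastnic_target / 1e12:.0f} Tb/s")
print(f"gap remaining after FastNIC: ~{processor_bits_per_s / fastnic_target:.0f}x")
```

Under those assumptions the interface is roughly a thousand times slower than the silicon behind it, and a hundredfold speedup still leaves an order of magnitude on the table.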

Sep 21, 2019

Google researchers have reportedly achieved “quantum supremacy”

Posted by in categories: quantum physics, supercomputing

The news: According to a report in the Financial Times, a team of researchers from Google led by John Martinis has demonstrated quantum supremacy for the first time. This is the point at which a quantum computer is shown to be capable of performing a task that’s beyond the reach of even the most powerful conventional supercomputer. The claim appeared in a paper that was posted on a NASA website, but the publication was then taken down. Google did not respond to a request for comment from MIT Technology Review.

Why NASA? Google struck an agreement last year to use supercomputers available to NASA as benchmarks for its supremacy experiments. According to the Financial Times report, the paper said that Google’s quantum processor was able to perform a calculation in three minutes and 20 seconds that would take today’s most advanced supercomputer, known as Summit, around 10,000 years. In the paper, the researchers said that, to their knowledge, the experiment “marks the first computation that can only be performed on a quantum processor.”
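The gap between those two quoted runtimes is worth spelling out; the short calculation below simply takes their ratio.

```python
# Ratio of the quoted classical runtime (about 10,000 years on Summit) to the
# quoted quantum runtime (3 minutes 20 seconds, i.e. 200 seconds).
seconds_per_year = 365.25 * 24 * 3600
classical_s = 10_000 * seconds_per_year
quantum_s = 200

print(f"speedup factor: ~{classical_s / quantum_s:.1e}")   # roughly 1.6e9
```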

Quantum speed up: Quantum machines are so powerful because they harness quantum bits, or qubits. Unlike classical bits, which are either a 1 or a 0, qubits can be in a kind of combination of both at the same time. Thanks to other quantum phenomena, which are described in our explainer here, quantum computers can crunch large amounts of data in parallel that conventional machines have to work through sequentially. Scientists have been working for years to demonstrate that the machines can definitively outperform conventional ones.
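As a minimal illustration of that “combination of both at the same time” (a standard textbook example, not anything specific to Google’s processor), the sketch below represents a qubit as two complex amplitudes and applies a Hadamard gate to put it into an equal superposition:

```python
import numpy as np

# A single qubit state is a length-2 complex vector of amplitudes over |0> and |1>.
ket0 = np.array([1.0, 0.0], dtype=complex)

# Hadamard gate: sends |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ ket0
probabilities = np.abs(state) ** 2          # Born rule: measurement probabilities

print("amplitudes:   ", state)              # [0.707+0j, 0.707+0j]
print("probabilities:", probabilities)      # [0.5, 0.5]

# A register of n qubits is described by 2**n such amplitudes, which is what a
# classical simulation has to store and update explicitly.
```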

Sep 21, 2019

Ghost post! Google creates world’s most powerful computer, NASA ‘accidentally reveals’ …and then publication vanishes

Posted by in categories: quantum physics, supercomputing

Google’s new quantum computer reportedly spends mere minutes on the tasks the world’s top supercomputers would need several millennia to perform. The media found out about this after NASA “accidentally” shared the firm’s research.

The software engineers at Google have built the world’s most powerful computer, the Financial Times and Fortune magazine reported on Friday, citing the company’s now-removed research paper. The paper is said to have been posted on a website hosted by NASA, which partners with Google, but later quietly taken down, without explanation.

Google and NASA have refused to comment on the matter. A source within the IT giant, however, told Fortune that NASA had “accidentally” published the paper before its team could verify its findings.

Sep 20, 2019

HPE to acquire supercomputer manufacturer Cray for $1.3 billion

Posted by in category: supercomputing

Hewlett Packard Enterprise has reached an agreement to acquire Cray, the manufacturer of supercomputing systems.

HPE says the acquisition will cost $35 per share, in a transaction valued at approximately $1.3 billion, net of cash.

Antonio Neri, president and CEO of HPE, says: “Answers to some of society’s most pressing challenges are buried in massive amounts of data.”

Sep 13, 2019

Brain-inspired computing could tackle big problems in a small way

Posted by in categories: neuroscience, supercomputing

While computers have become smaller and more powerful, and supercomputers and parallel computing have become the standard, we are about to hit a wall in energy and miniaturization. Now, Penn State researchers have designed a 2-D device that can provide more than yes-or-no answers and could be more brainlike than current computing architectures.

“Complexity scaling is also in decline owing to the non-scalability of traditional von Neumann computing architecture and the impending ‘Dark Silicon’ era that presents a severe threat to multi-core processor technology,” the researchers note in today’s (Sept 13) online issue of Nature Communications.

The Dark Silicon era is already upon us to some extent and refers to the inability of all or most of the devices on a computer chip to be powered up at once. This happens because of too much heat generated on a single chip. Von Neumann architecture is the standard structure of most modern computers and relies on a digital approach—“yes” or “no” answers—where program instructions and data are stored in the same memory and share the same communications channel.
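A crude way to put numbers on Dark Silicon (a toy model; every figure below is an assumption of mine, not from the paper): with a fixed power and cooling budget, only some of the cores on a die can run at full speed at once.

```python
# Toy Dark Silicon estimate; every number here is an illustrative assumption.
power_budget_w = 100.0        # fixed package power/cooling budget
cores = 64                    # cores on the die
watts_per_active_core = 4.0   # draw of one core running at full speed

max_active = min(power_budget_w / watts_per_active_core, cores)
dark_fraction = 1.0 - max_active / cores

print(f"cores that can run at full power: {max_active:.0f} of {cores}")
print(f"'dark' fraction of the chip:      {dark_fraction:.0%}")
```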

Sep 7, 2019

Scientists develop a deep learning method to solve a fundamental problem in statistical physics

Posted by in categories: biotech/medical, robotics/AI, supercomputing

A team of scientists at Freie Universität Berlin has developed an Artificial Intelligence (AI) method that provides a fundamentally new solution to the “sampling problem” in statistical physics. The sampling problem is that important properties of materials and molecules practically cannot be computed by directly simulating the motion of atoms on a computer, because the required computational capacities are too vast even for supercomputers. The team developed a deep learning method that speeds up these calculations massively, making them feasible for previously intractable applications. “AI is changing all areas of our life, including the way we do science,” explains Dr. Frank Noé, professor at Freie Universität Berlin and main author of the study. Several years ago, so-called deep learning methods bested human experts in pattern recognition—be it the reading of handwritten texts or the recognition of cancer cells in medical images. “Since these breakthroughs, AI research has skyrocketed. Every day, we see new developments in application areas where traditional methods have left us stuck for years. We believe our approach could be such an advance for the field of statistical physics.” The results were published in Science.

Statistical physics aims at calculating the properties of materials or molecules based on the interactions of their constituent components—be it a metal’s melting temperature, or whether an antibiotic can bind to the molecules of a bacterium and thereby disable it. With statistical methods, such properties can be calculated on a computer, and the properties of the material or the efficiency of a specific medication can be improved. One of the main problems with this calculation is the vast computational cost, explains Simon Olsson, a coauthor of the study: “In principle we would have to consider every single structure, that is, every way to position all the atoms in space, compute its probability, and then take their average. But this is impossible because the number of possible structures is astronomically large even for small molecules.”
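A toy version of the enumeration Olsson describes is sketched below (an illustrative stand-in using a small two-state-per-site model, not the molecular systems of the study): the exact “enumerate every structure, weight by probability, average” procedure works for a dozen sites, but the number of configurations grows as 2^N and becomes astronomically large well before molecular scales.

```python
import itertools
import numpy as np

# Toy stand-in for "consider every single structure": N two-state sites with a
# nearest-neighbour Ising-like energy. All parameters are illustrative.
N = 12        # 2**12 = 4096 configurations; ~50 sites would already be hopeless
J = 1.0       # coupling between neighbouring sites
kT = 1.0      # temperature in units of the coupling

def energy(config):
    """Nearest-neighbour energy for a chain of spins taking the values +/-1."""
    return -J * sum(config[i] * config[i + 1] for i in range(len(config) - 1))

configs = list(itertools.product((-1, 1), repeat=N))
energies = np.array([energy(c) for c in configs])
weights = np.exp(-energies / kT)                  # unnormalised Boltzmann weights
average_energy = np.sum(weights * energies) / np.sum(weights)

print(f"configurations enumerated: {len(configs)}")
print(f"Boltzmann-average energy:  {average_energy:.3f}")
```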
