BLOG

Archive for the ‘supercomputing’ category: Page 31

Jan 20, 2022

The Human Brain-Scale AI Supercomputer Is Coming

Posted by in categories: government, robotics/AI, supercomputing

What’s next? Human brain-scale AI.

Funded by the Slovak government with money allocated by the EU, the I4DI consortium is behind the initiative to build a 64 AI exaflop machine (64 billion billion, i.e. 64 × 10^18, AI operations per second) on our platform by the end of 2022. This will enable Slovakia and the EU to deliver, for the first time in the history of humanity, a human brain-scale AI supercomputer. Meanwhile, almost a dozen other countries are watching this project closely, with interest in replicating the supercomputer in their own countries.

There are multiple approaches to achieving human brain-like AI, including machine learning, spiking neural networks such as SpiNNaker, neuromorphic computing, bio AI, explainable AI, and general AI. Supporting several of these approaches at once requires universal supercomputers with universal processors if humanity is to deliver human brain-scale AI.
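For readers unfamiliar with the spiking approach mentioned above, here is a minimal leaky integrate-and-fire neuron in Python, the basic unit simulated by platforms such as SpiNNaker. It is purely illustrative; the parameter values are arbitrary assumptions and are unrelated to the I4DI machine.

```python
# Minimal leaky integrate-and-fire (LIF) neuron: the elementary unit simulated
# by spiking platforms such as SpiNNaker. All parameter values are illustrative.
import numpy as np

dt = 0.1          # time step (ms)
tau = 10.0        # membrane time constant (ms)
v_rest = 0.0      # resting potential (arbitrary units)
v_thresh = 1.0    # spike threshold
v_reset = 0.0     # reset potential after a spike

v = v_rest
spikes = []
current = 0.15    # constant input current (arbitrary units)

for step in range(1000):                      # 100 ms of simulated time
    # Leaky integration: the membrane potential decays toward rest
    # and is pushed upward by the input current.
    v += dt * (-(v - v_rest) / tau + current)
    if v >= v_thresh:                         # threshold crossing -> emit a spike
        spikes.append(step * dt)
        v = v_reset                           # reset after spiking

print(f"{len(spikes)} spikes in 100 ms, first at t = {spikes[0]:.1f} ms" if spikes else "no spikes")
```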

Jan 20, 2022

Quantum Computer With More Than 5,000 Qubits Launched

Posted by in categories: quantum physics, supercomputing

Official launch marks a milestone in the development of quantum computing in Europe.

A quantum annealer with more than 5,000 qubits has been put into operation at Forschungszentrum Jülich. The Jülich Supercomputing Centre (JSC) and D-Wave Systems, a leading provider of quantum computing systems, today launched the company’s first cloud-based quantum service outside North America. The new system is located at Jülich and will work closely with the supercomputers at JSC in the future. The annealing quantum computer is part of the Jülich UNified Infrastructure for Quantum computing (JUNIQ), which was established in autumn 2019 to provide researchers in Germany and Europe with access to various quantum systems.
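To give a flavor of the kind of problem a quantum annealer works on, here is a small sketch using D-Wave's open-source dimod package. The toy QUBO and the use of the local classical ExactSolver (rather than the Jülich hardware, which is reached through JUNIQ's cloud service) are assumptions made purely for illustration.

```python
# Sketch of posing a tiny optimization problem as a QUBO, the input format a
# quantum annealer accepts. Solved here with dimod's classical ExactSolver; on
# the Juelich system the same model would be handed to a hardware sampler via
# the JUNIQ cloud service. The toy problem itself is an assumption.
import dimod

# Minimize x0 + x1 - 2*x0*x1 over x0, x1 in {0, 1}:
# the energy is lowest (0) when the two variables agree.
Q = {(0, 0): 1.0, (1, 1): 1.0, (0, 1): -2.0}

bqm = dimod.BinaryQuadraticModel.from_qubo(Q)
result = dimod.ExactSolver().sample(bqm)

print(result.first.sample, result.first.energy)
# e.g. {0: 0, 1: 0} 0.0 -- either agreeing assignment attains the minimum energy
```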

Jan 19, 2022

Light-matter interactions simulated on the world’s fastest supercomputer

Posted by in categories: physics, supercomputing

Light-matter interactions form the basis of many important technologies, including lasers, light-emitting diodes (LEDs), and atomic clocks. However, conventional computational approaches for modeling such interactions are limited in their usefulness and capability. Now, researchers from Japan have developed a technique that overcomes these limitations.

In a study published this month in The International Journal of High Performance Computing Applications, a research team led by the University of Tsukuba describes a highly efficient method for simulating light-matter interactions at the atomic scale.

What makes these interactions so difficult to simulate? One reason is that phenomena associated with the interactions encompass many areas of physics, involving both the propagation of light waves and the dynamics of electrons and ions in matter. Another reason is that such phenomena can cover a wide range of length and time scales.
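As a much-simplified picture of the coupled dynamics involved, the sketch below integrates a classical Lorentz-oscillator model of a bound electron driven by an oscillating light field. It is only a toy model, not the first-principles method described in the paper, and every parameter value is an assumption.

```python
# Toy light-matter model: a bound electron treated as a damped harmonic
# oscillator driven by an oscillating electric field (classical Lorentz model).
# This is only a cartoon of the coupled electron/field dynamics the paper
# simulates from first principles; all values are arbitrary, in scaled units.
import numpy as np

dt = 0.001            # time step
omega0 = 5.0          # natural frequency of the bound electron
gamma = 0.1           # damping rate
omega_L = 4.8         # frequency of the driving light field
E0 = 1.0              # field amplitude

x, v = 0.0, 0.0       # electron displacement and velocity
xs = []

for n in range(20000):
    t = n * dt
    E = E0 * np.cos(omega_L * t)                 # driving light field
    a = -omega0**2 * x - gamma * v + E           # equation of motion
    v += a * dt                                  # semi-implicit Euler step
    x += v * dt
    xs.append(x)

# The induced dipole (proportional to x) oscillates at the drive frequency,
# with an amplitude that grows sharply as omega_L approaches omega0 (resonance).
print(f"max displacement over the run: {max(np.abs(xs)):.3f}")
```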

Jan 17, 2022

New Silicon Carbide Qubits Bring Us One Step Closer to Quantum Networks

Posted by in categories: quantum physics, supercomputing

Chromium defects in silicon carbide may provide a new platform for quantum information.

Quantum computers may be able to solve science problems that are impossible for today’s fastest conventional supercomputers. Quantum sensors may be able to measure signals that cannot be measured by today’s most sensitive sensors. Quantum bits (qubits) are the building blocks for these devices. Scientists are investigating several quantum systems for quantum computing and sensing applications. One such system, the spin qubit, is based on controlling the orientation of an electron’s spin at defect sites in a semiconductor material. Such defects can be small amounts of an element different from the main material the semiconductor is made of. Researchers recently demonstrated how to make high-quality spin qubits based on chromium defects in silicon carbide.
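To make the idea of controlling the orientation of an electron’s spin concrete, here is a minimal textbook sketch of a driven two-level spin (a Rabi oscillation) in Python. It is a generic toy model, not a simulation of the chromium defects in silicon carbide, and all parameter values are assumptions.

```python
# Generic two-level spin qubit under a resonant drive (Rabi oscillation),
# simulated with small matrix exponentials. This is a textbook toy model,
# not a simulation of chromium defects; all values are arbitrary.
import numpy as np
from scipy.linalg import expm

sx = np.array([[0, 1], [1, 0]], dtype=complex)   # Pauli X
Omega = 2 * np.pi * 1.0                          # Rabi frequency (rad per unit time)
H = 0.5 * Omega * sx                             # drive Hamiltonian in the rotating frame

psi = np.array([1, 0], dtype=complex)            # start in spin state |0>
for t in np.linspace(0, 1, 6):
    U = expm(-1j * H * t)                        # time-evolution operator
    p1 = abs((U @ psi)[1]) ** 2                  # probability of finding |1>
    print(f"t = {t:.1f}  P(|1>) = {p1:.3f}")
# P(|1>) sweeps 0 -> 1 -> 0 over one Rabi period: this kind of controlled
# rotation is how a spin qubit is steered into a desired state.
```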

Jan 11, 2022

Supercomputing! The Purest Indicator of Structural Technological and Economic Progress (1H 2022)

Posted by in categories: economics, supercomputing

How to check the trend of supercomputing progress, and why it is as close to a pure indicator of technological progress rates as one can find. The recent flattening of this trend reveals a corresponding flattening in overall technological and economic progress relative to long-term trendlines.

Top500.org chart: https://top500.org/statistics/perfdevel/

Continue reading “Supercomputing! The Purest Indicator of Structural Technological and Economic Progress (1H 2022)” »
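As a sketch of the kind of check the post describes, the snippet below fits a straight line in log space to early Top500-style number-one-system performance figures and compares recent values against the extrapolated trendline. The numbers are rough placeholders, not actual Top500 data; the real figures live at the chart linked above.

```python
# Fit an exponential trend (a straight line in log space) to early top-system
# performance figures, extrapolate, and compare against recent values to see
# whether growth has flattened. The values below are made-up placeholders,
# not real Top500 numbers.
import numpy as np

years = np.array([2008, 2010, 2012, 2014, 2016, 2018, 2020, 2021])
rmax_pflops = np.array([1.0, 2.6, 17.6, 33.9, 93.0, 122.0, 442.0, 442.0])  # placeholders

# Fit the long-term trend on the earlier points only, then extrapolate.
fit = np.polyfit(years[:5], np.log10(rmax_pflops[:5]), 1)
trend = 10 ** np.polyval(fit, years)

for y, actual, expected in zip(years, rmax_pflops, trend):
    print(f"{y}: actual {actual:8.1f} PFlops, long-term trend {expected:8.1f} PFlops")
# Recent actual values falling below the extrapolated trendline is the
# "flattening" the post refers to.
```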

Jan 11, 2022

Nanowire transistor with integrated memory to enable future supercomputers

Posted by in categories: nanotechnology, robotics/AI, supercomputing

For many years, a bottleneck in technological development has been how to get processors and memories to work faster together. Now, researchers at Lund University in Sweden have presented a new solution integrating a memory cell with a processor, which enables much faster calculations, as they happen in the memory circuit itself.

In an article in Nature Electronics, the researchers present a new configuration, in which a memory cell is integrated with a vertical transistor selector, all at the nanoscale. This brings improvements in scalability, speed and energy efficiency compared with current mass storage solutions.

The fundamental issue is that anything requiring large amounts of data to be processed, such as AI, requires speed and more capacity. For this to be successful, the memory and processor need to be as close to each other as possible. In addition, it must be possible to run the calculations in an energy-efficient manner, not least because current technology generates high temperatures under heavy loads.
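A rough back-of-envelope sketch of the bottleneck described above: even for a trivial operation, moving data between memory and processor can dominate the total time. The bandwidth and throughput figures below are assumptions chosen only to illustrate the imbalance.

```python
# Back-of-envelope sketch of the processor/memory bottleneck the Lund work
# targets: for a simple element-wise addition, time spent moving data can far
# exceed time spent computing. All rates below are illustrative assumptions.
bytes_moved = 3 * 8 * 10**9          # read two 1-billion-element float64 arrays, write one
flops = 1 * 10**9                    # one addition per element

mem_bandwidth = 200e9                # bytes/s, assumed DRAM bandwidth
compute_rate = 1e12                  # flop/s, assumed processor throughput

t_memory = bytes_moved / mem_bandwidth
t_compute = flops / compute_rate

print(f"data movement: {t_memory*1e3:.1f} ms, arithmetic: {t_compute*1e3:.1f} ms")
# With these assumptions the arithmetic takes ~1 ms but the data movement ~120 ms,
# which is why computing inside the memory circuit itself pays off.
```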

Jan 10, 2022

Newcomer Conduit Leverages Frontera to Understand SARS-CoV-2 ‘Budding’

Posted by in categories: biotech/medical, genetics, supercomputing

I am happy to say that my recently published computational COVID-19 research has been featured in a major news article by HPCwire! I led this research as CTO of Conduit. My team utilized one of the world’s top supercomputers (Frontera) to study the mechanisms by which the coronavirus’s M proteins and E proteins facilitate budding, an understudied part of the SARS-CoV-2 life cycle. Our results may provide the foundation for new ways of designing antiviral treatments which interfere with budding. Thank you to Ryan Robinson (Conduit’s CEO) and my computational team: Ankush Singhal, Shafat M., David Hill, Jr., Tamer Elkholy, Kayode Ezike, and Ricky Williams.


Conduit, created by MIT graduate (and current CEO) Ryan Robinson, was founded in 2017. But it may not have found its true calling until a few years later, when the pandemic started. While Conduit’s commercial division is busy developing a Covid-19 test called nanoSPLASH, its nonprofit arm was granted access to one of the most powerful supercomputers in the world, Frontera at the Texas Advanced Computing Center (TACC), to model the “budding” process of SARS-CoV-2.

Budding, the researchers explained, is how the virus’ genetic material is encapsulated in a spherical envelope, and the process is key to the virus’ ability to infect. Despite that, they say, it has hitherto been poorly understood:

Continue reading “Newcomer Conduit Leverages Frontera to Understand SARS-CoV-2 ‘Budding’” »

Jan 5, 2022

Bug in backup software results in loss of 77 terabytes of research data at Kyoto University

Posted by in categories: cybercrime/malcode, supercomputing

Computer maintenance workers at Kyoto University have announced that, due to an apparent bug in software used to back up research data, researchers using the university’s Hewlett Packard Enterprise Cray supercomputing system and its Lustre file system have lost approximately 77 terabytes of data. The team at the university’s Institute for Information Management and Communication posted a Failure Information page detailing what is known so far about the data loss.

The team, part of the university’s Information Department, Information Infrastructure Division (Supercomputing), reported that files in the /LARGE0 area (on the DataDirect ExaScaler storage system) were lost during a system backup procedure. Some in the press have suggested that the problem arose from a faulty script that was supposed to delete only old, unneeded log files. The team noted that it was originally thought that approximately 100TB of files had been lost, but that figure has since been pared down to 77TB. They also note that the failure occurred on December 16 between 5:50 and 7:00 p.m. Affected users were immediately notified via email. The team further notes that approximately 34 million files were lost and that they belonged to 14 known research groups. The team did not release the names of the research groups or what sort of research they were conducting, but did note that data from another four groups appears to be restorable.
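Press reports suggested a cleanup script that was supposed to delete only old log files. Below is a purely hypothetical Python sketch of how such a cleanup can constrain what it touches; the directory, suffix and age threshold are invented for illustration, and this is not the script involved in the incident.

```python
# Hypothetical sketch of a cleanup job of the kind press reports describe:
# remove only log files older than a cutoff, restricted to an explicit directory.
# This is NOT the script involved in the Kyoto incident; the path, suffix and
# age threshold below are invented for illustration.
import time
from pathlib import Path

LOG_DIR = Path("/backup/scratch/logs")   # assumed location of expendable logs
MAX_AGE_DAYS = 10

cutoff = time.time() - MAX_AGE_DAYS * 86400

if LOG_DIR.is_dir():
    for path in LOG_DIR.glob("*.log"):   # only the expected directory and suffix
        if path.is_file() and path.stat().st_mtime < cutoff:
            print(f"removing old log {path}")
            path.unlink()
```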

Jan 1, 2022

Kyoto University Loses 77 Terabytes of Research Data After Supercomputer Backup Error

Posted by in categories: climatology, engineering, quantum physics, supercomputing, sustainability

Unfortunately, some of the data is lost forever. 🧐



A routine backup procedure meant to safeguard data of researchers at Kyoto University in Japan went awry and deleted 77 terabytes of data, Gizmodo reported. The incident occurred between December 14 and 16, first came to light on the 16th, and affected as many as 14 research groups at the university.

Continue reading “Kyoto University Loses 77 Terabytes of Research Data After Supercomputer Backup Error” »

Dec 26, 2021

One of the World’s Most Powerful Supercomputers Uses Light Instead of Electric Current

Posted by in categories: quantum physics, robotics/AI, supercomputing

France’s Jean Zay supercomputer, one of the most powerful computers in the world and part of the Top500, is now the first HPC to have a photonic coprocessor, meaning it transmits and processes information using light. The development represents a first for the industry.

The breakthrough was made during a pilot program that saw LightOn collaborate with GENCI and IDRIS. Igor Carron, LightOn’s CEO and co-founder, said in a press release: “This pilot program integrating a new computing technology within one of the world’s Supercomputers would not have been possible without the particular commitment of visionary agencies such as GENCI and IDRIS/CNRS. Together with the emergence of Quantum Computing, this world premiere strengthens our view that the next step after exascale supercomputing will be about hybrid computing.”

The technology will now be offered over the next few months to select users of the Jean Zay research community, who will use the device to conduct research on machine learning foundations, differential privacy, satellite imaging analysis, and natural language processing (NLP) tasks. LightOn’s technology has already been used successfully by a community of researchers since 2018.
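LightOn’s photonic coprocessor is generally described as performing very large random projections in the optical domain. The sketch below mimics a transform of that general form classically with NumPy (project onto a fixed random matrix, then take intensities); the exact transform and the dimensions used on Jean Zay are assumptions here.

```python
# Classical mimic of an optical random-projection coprocessor: multiply the
# input by a fixed random complex matrix and keep the intensities. The exact
# transform and the dimensions used on Jean Zay are assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_out = 784, 2000                     # e.g. a flattened image -> random features

R = rng.standard_normal((n_out, n_in)) + 1j * rng.standard_normal((n_out, n_in))
x = rng.random(n_in)                        # a stand-in input vector

features = np.abs(R @ x) ** 2               # intensity of the random projection

print(features.shape)                       # (2000,) random features for downstream ML
```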
