
After years of dedicated research and more than 5 million hours of supercomputer time, a team has created the world’s first high-resolution 3D radiation hydrodynamics simulations of exotic supernovae. The work is reported in The Astrophysical Journal.

Ke-Jung Chen of the Academia Sinica Institute of Astronomy and Astrophysics (ASIAA) in Taiwan led an international team that used powerful supercomputers at Lawrence Berkeley National Laboratory and the National Astronomical Observatory of Japan to make the breakthrough.

Supernova explosions are the most spectacular endings for massive stars: the stars conclude their lives in a self-destructive blast, instantaneously releasing a brightness equivalent to billions of suns and illuminating the entire universe.

Morgan Stanley released a report Monday predicting a hopeful, semiconductor-driven outlook for Musk’s company.

Tesla’s shares were up 9.5 percent yesterday. But what drove them up?

The investment banking firm issued a research note that upgraded the Elon Musk-led automaker’s rating from ‘equal-weight’ to ‘overweight’ and raised its price target to $400 from a prior $250. An ‘overweight’ rating means that the analysts, in this case at Morgan Stanley (MS), expect Tesla’s stock to outperform its industry in the market.


A Morgan Stanley research report. (Image: Wikimedia Commons.)

Morgan Stanley says Tesla’s market value may surge by $500 billion because of its Dojo supercomputer, by way of robotaxis and network services.


Dojo can open up “new addressable markets,” just like AWS did for Amazon.com Inc., analysts led by Adam Jonas wrote in a note, upgrading the stock to overweight from equal-weight and raising its 12-month price target to a Street-high $400 per share from $250.

Shares of Tesla, which have already more than doubled this year, rose as much as 6.1% in US premarket trading Monday. The stock was on track to add about $46 billion in market value. Morgan Stanley is one of Musk’s key advisory firms, including on the $44 billion takeover of Twitter Inc., now known as X.
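
As a quick consistency check on those figures (an illustration added here, not part of the Morgan Stanley note), the dollar gain and the percentage move together imply the pre-move market value:

```python
# Rough sanity check of the reported figures (illustrative only; the numbers
# come from the article above, not from the note itself).
premarket_gain = 0.061          # reported 6.1% premarket rise
added_value_bn = 46             # reported gain in market value, in $ billions

implied_market_cap_bn = added_value_bn / premarket_gain
print(f"Implied pre-move market value: ~${implied_market_cap_bn:,.0f}B")
# -> Implied pre-move market value: ~$754B
```

That figure lines up, roughly, with Tesla’s market capitalization at the time, which is all the check is meant to show.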

The supercomputer, designed to handle massive amounts of data in training driving systems, may put Tesla at “an asymmetric advantage” in a market potentially worth $10 trillion, said Jonas, and could make software and services the biggest value driver for Tesla from here onward.

A machine-learning algorithm demonstrated the capability to process data that exceeds a computer’s available memory by identifying a massive data set’s key features and dividing them into manageable batches that don’t choke computer hardware. Developed at Los Alamos National Laboratory, the algorithm set a world record for factorizing huge data sets during a test run on Oak Ridge National Laboratory’s Summit, the world’s fifth-fastest supercomputer.

Equally efficient on laptops and supercomputers, the highly scalable algorithm solves hardware bottlenecks that prevent information from being processed in data-rich applications such as social media networks, national security science, and earthquake research, to name just a few.

“We developed an ‘out-of-memory’ implementation of the non-negative matrix factorization method that allows you to factorize data sets larger than previously possible on a given piece of hardware,” said Ismael Boureima, a computational physicist at Los Alamos National Laboratory. Boureima is first author of the paper in The Journal of Supercomputing on the record-breaking algorithm.
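
The paper’s own implementation isn’t reproduced here, but the general idea of an out-of-memory factorization can be sketched: stream the matrix from disk in row chunks, keep only the small factor matrices and chunk-sized intermediates in memory, and apply the standard multiplicative NMF updates. The chunking scheme, update rule, and function names below are illustrative assumptions, not the authors’ code.

```python
import numpy as np

def out_of_core_nmf(chunk_iter, n_cols, rank, n_iter=50, eps=1e-9, seed=0):
    """Sketch of non-negative matrix factorization V ~ W @ H for a matrix V
    that is too large for memory and is streamed as a sequence of row chunks.

    `chunk_iter` is a callable returning an iterator of (chunk_index, V_chunk)
    pairs, e.g. loading .npy files from disk one at a time. Only one chunk of
    V, its matching block of W, and the small rank-by-n_cols factor H are ever
    held in memory together.
    """
    rng = np.random.default_rng(seed)
    H = rng.random((rank, n_cols)) + eps
    W_blocks = {}                        # one block of W per row chunk

    for _ in range(n_iter):
        WtV = np.zeros((rank, n_cols))   # accumulates W^T @ V across chunks
        WtW = np.zeros((rank, rank))     # accumulates W^T @ W across chunks
        HHt = H @ H.T                    # small rank-by-rank, reused per chunk

        for idx, V_c in chunk_iter():
            if idx not in W_blocks:
                W_blocks[idx] = rng.random((V_c.shape[0], rank)) + eps
            W_c = W_blocks[idx]
            # Multiplicative update of this block of W; rows are independent,
            # so each chunk can be updated on its own.
            W_c *= (V_c @ H.T) / (W_c @ HHt + eps)
            # Accumulate the small products needed for the H update.
            WtV += W_c.T @ V_c
            WtW += W_c.T @ W_c

        # Multiplicative update of H from the accumulated products.
        H *= WtV / (WtW @ H + eps)

    return W_blocks, H

# Toy usage: stream a 200 x 50 matrix as two row chunks.
V = np.abs(np.random.default_rng(1).standard_normal((200, 50)))
W_blocks, H = out_of_core_nmf(lambda: enumerate(np.array_split(V, 2)),
                              n_cols=50, rank=5)
```

The full matrix is never materialized at once; each sweep touches one chunk at a time, trading memory for I/O, which is the essence of the out-of-memory approach described above.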

Tesla’s (TSLA) stock is rising in pre-market trading on an optimistic new report about the automaker’s Dojo supercomputer coming from Morgan Stanley.

The firm massively increased its price target on Tesla’s stock because of it.

Dojo is Tesla’s own custom supercomputer platform built from the ground up for AI machine learning and, more specifically, for video training using the video data coming from its fleet of vehicles.

It is expected to deliver performance up to eight times faster than its predecessor.
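
The article doesn’t describe Dojo’s software stack, but the workload it names, supervised training of vision networks on fleet video, looks at its most generic like the loop below. The framework choice (PyTorch), the model, the clip shape, and the labels are placeholder assumptions for illustration only, not Tesla’s actual networks or data.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for a driving-perception model; Tesla's real networks
# and training labels are not described in the article.
class TinyVideoNet(nn.Module):
    def __init__(self, n_classes=10):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv3d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d(1), nn.Flatten(),
        )
        self.head = nn.Linear(16, n_classes)

    def forward(self, clips):            # clips: (batch, 3, frames, H, W)
        return self.head(self.backbone(clips))

model = TinyVideoNet()
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Fake "fleet video" batch: 4 clips of 8 RGB frames at 64x64 resolution.
clips = torch.randn(4, 3, 8, 64, 64)
labels = torch.randint(0, 10, (4,))

for step in range(3):                    # skeleton of a training loop
    logits = model(clips)
    loss = loss_fn(logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    print(f"step {step}: loss={loss.item():.3f}")
```

The point of a machine like Dojo is to run loops of this shape across enormous batches of real video, which is why raw training throughput is the headline metric.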

The Los Alamos National Laboratory (LANL) is in the final stages of setting up a new supercomputer, dubbed Crossroads, that will allow it to assess the US nuclear stockpile without live testing, a press release said. The system has been supplied by Hewlett Packard Enterprise, and installation began in June of this year.

The Department of Energy (DoE) is tasked with ensuring that the US nuclear stockpile can be relied upon if and when it needs to be used. For this purpose, the federal agency does not actually test the warheads but carries out simulations to assess the storage, maintenance, and efficacy of the weapons.

In a recently published article featured on the cover of the Biophysical Journal, Dr. Rafael Bernardi, assistant professor of biophysics at the Department of Physics at Auburn University, and Dr. Marcelo Melo, a postdoctoral researcher in Dr. Bernardi’s group, shed light on the transformative capabilities of the next generation of supercomputers in reshaping the landscape of biophysics.

The researchers at Auburn delve into the harmonious fusion of computational modeling and experimental work, offering a perspective on a future in which discoveries are made with unparalleled precision. Rather than being mere observers, today’s biophysicists, with the aid of advanced high-performance computing (HPC), are now trailblazers who can challenge longstanding biological assumptions, illuminate intricate details, and even create new proteins or design novel molecular circuits.

One of the most important aspects discussed in their perspective article is the new ability of computational biophysicists to simulate complex biological systems, ranging from the subatomic scale to whole-cell models, in extraordinary detail.
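
The article doesn’t name the specific codes involved, but the workhorse behind simulations at these scales is molecular dynamics: repeatedly integrating Newton’s equations of motion for many interacting particles. A minimal velocity-Verlet integrator for a toy Lennard-Jones system, purely illustrative and far simpler than anything the Auburn group runs, looks like this:

```python
import numpy as np

def lennard_jones_forces(pos, epsilon=1.0, sigma=1.0):
    """Pairwise Lennard-Jones forces for a handful of particles (toy model)."""
    n = len(pos)
    forces = np.zeros_like(pos)
    for i in range(n):
        for j in range(i + 1, n):
            r_vec = pos[i] - pos[j]
            r = np.linalg.norm(r_vec)
            # -dU/dr for U(r) = 4*eps*((sigma/r)^12 - (sigma/r)^6)
            f_mag = 24 * epsilon * (2 * (sigma / r)**12 - (sigma / r)**6) / r
            f = f_mag * r_vec / r
            forces[i] += f
            forces[j] -= f
    return forces

def velocity_verlet(pos, vel, dt=1e-3, steps=1000, mass=1.0):
    """Integrate Newton's equations with the velocity-Verlet scheme."""
    f = lennard_jones_forces(pos)
    for _ in range(steps):
        pos = pos + vel * dt + 0.5 * (f / mass) * dt**2
        f_new = lennard_jones_forces(pos)
        vel = vel + 0.5 * (f + f_new) / mass * dt
        f = f_new
    return pos, vel

# Toy system: three particles near the equilibrium spacing.
rng = np.random.default_rng(0)
positions = np.array([[0.0, 0.0, 0.0], [1.1, 0.0, 0.0], [0.0, 1.1, 0.0]])
velocities = 0.01 * rng.standard_normal((3, 3))
positions, velocities = velocity_verlet(positions, velocities)
```

Production biophysics codes do essentially this for millions to billions of atoms, which is why exascale machines change what questions can be asked.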

Quantum technologies—and quantum computers in particular—have the potential to shape the development of technology in the future. Scientists believe that quantum computers will help them solve problems that even the fastest supercomputers are unable to handle yet. Large international IT companies and countries like the United States and China have been making significant investments in the development of this technology. But because quantum computers are based on different laws of physics than conventional computers, laptops, and smartphones, they are more susceptible to malfunction.

An interdisciplinary research team led by Professor Jens Eisert, a physicist at Freie Universität Berlin, has now found ways of testing the quality of quantum computers. Their study on the subject was recently published in the scientific journal Nature Communications. These scientific quality control tests incorporate methods from physics, computer science, and mathematics.

Professor Jens Eisert, quantum physicist at Freie Universität Berlin and author of the study, explains the science behind the research: “Quantum computers work on the basis of quantum mechanical laws of physics, in which atoms or ions are used as computational units—or to put it another way—controlled, minuscule physical systems. What is extraordinary about these computers of the future is that at this level, nature functions radically differently from our everyday experience of the world and how we know and perceive it.”
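
To make “radically different from everyday experience” slightly more concrete: a single qubit can sit in a superposition of 0 and 1, and a measurement collapses it to one outcome with probabilities set by the state’s amplitudes. The tiny state-vector simulation below illustrates basic qubit behavior; it is not the Berlin team’s certification method.

```python
import numpy as np

# Single-qubit state |psi> = a|0> + b|1>, prepared here by applying a
# Hadamard gate to |0>, which gives an equal superposition.
ket0 = np.array([1.0, 0.0])
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
psi = H @ ket0                                  # (|0> + |1>)/sqrt(2)

probs = np.abs(psi) ** 2                        # Born rule: |amplitude|^2
print("P(0), P(1) =", probs)                    # -> [0.5 0.5]

# Repeated measurements collapse randomly to 0 or 1 with those probabilities.
rng = np.random.default_rng(42)
samples = rng.choice([0, 1], size=1000, p=probs)
print("fraction measured as 0:", np.mean(samples == 0))
```

Certifying a real device means checking that its hardware actually behaves like this idealized mathematics, which is exactly what makes quality control hard.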

Since the start of the quantum race, Microsoft has placed its bets on the elusive but potentially game-changing topological qubit. Now the company claims its Hail Mary has paid off, saying it could build a working processor in less than a decade.

Today’s leading quantum computing companies have predominantly focused on qubits—the quantum equivalent of bits—made out of superconducting electronics, trapped ions, or photons. These devices have achieved impressive milestones in recent years, but are hampered by errors that mean a quantum computer able to outperform classical ones still appears some way off.

Microsoft, on the other hand, has long championed topological quantum computing. Rather than encoding information in the states of individual particles, this approach encodes information in the overarching structure of the system. In theory, that should make the devices considerably more tolerant of background noise from the environment and therefore more or less error-proof.
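
Topological encoding itself is hard to demonstrate in a few lines, but the underlying intuition, spreading one logical bit over many physical carriers so that local noise can be voted away, can be illustrated with the far simpler classical three-bit repetition code. This is a stand-in for the general principle, not Microsoft’s scheme.

```python
import numpy as np

def logical_error_rate(p_flip, n_trials=100_000, seed=0):
    """Logical error rate of a 3-bit repetition code with majority voting.

    Classical stand-in for the idea that encoding information across a larger
    structure suppresses local errors; real topological qubits work very
    differently, but the payoff has the same flavor.
    """
    rng = np.random.default_rng(seed)
    # Encode logical 0 as (0, 0, 0); flip each physical bit independently.
    flips = rng.random((n_trials, 3)) < p_flip
    decoded_wrong = flips.sum(axis=1) >= 2       # majority vote fails
    return decoded_wrong.mean()

for p in (0.01, 0.05, 0.10):
    print(f"physical error {p:.2f} -> logical error {logical_error_rate(p):.4f}")
# For small p the logical rate scales as ~3p^2, well below the physical rate.
```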

The teams pitted IBM’s 127-qubit Eagle chip against supercomputers at Lawrence Berkeley National Lab and Purdue University for increasingly complex tasks. With easier calculations, Eagle matched the supercomputers’ results every time—suggesting that even with noise, the quantum computer could generate accurate responses. But where it shone was in its ability to tolerate scale, returning results that are—in theory—far more accurate than what’s possible today with state-of-the-art silicon computer chips.

At the heart is a post-processing technique that reduces noise. Like someone taking in a large painting, the method ignores individual brush strokes; instead, it focuses on small portions of the painting and captures the general “gist” of the artwork.
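
The article doesn’t name the technique, but noise-mitigation post-processing of this general kind often works by measuring the same quantity at several deliberately amplified noise levels and extrapolating back to an estimated zero-noise value. The toy extrapolation below is an assumption about that general approach, not IBM’s exact procedure.

```python
import numpy as np

# Suppose the ideal (noise-free) expectation value of some observable is 0.80,
# and noise damps the measured value roughly linearly with the amount of noise.
ideal_value = 0.80
noise_factors = np.array([1.0, 1.5, 2.0, 3.0])   # 1.0 = the hardware's own noise
rng = np.random.default_rng(7)
measured = ideal_value * (1 - 0.15 * noise_factors) + rng.normal(0, 0.005, 4)

# Fit the trend and extrapolate to a hypothetical noise factor of zero.
slope, intercept = np.polyfit(noise_factors, measured, deg=1)
print(f"raw value at native noise:    {measured[0]:.3f}")
print(f"extrapolated zero-noise est.: {intercept:.3f}")   # close to 0.80
```

The device never runs noise-free; the cleaner answer is inferred from how the noisy answers trend, which is why this counts as post-processing rather than better hardware.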

The study, published in Nature, isn’t chasing quantum advantage, the point at which quantum computers solve problems faster than conventional computers can. Rather, it shows that today’s quantum computers, even when imperfect, may become part of scientific research—and perhaps our lives—sooner than expected. In other words, we’ve now entered the realm of quantum utility.