
Analyzing massive datasets from nuclear physics experiments can take hours or days to process, but researchers are working to radically reduce that time to mere seconds using special software being developed at the Department of Energy’s Lawrence Berkeley and Oak Ridge national laboratories.

DELERIA—short for Distributed Event-Level Experiment Readout and Integrated Analysis—is a novel software platform designed specifically to support the GRETA spectrometer, a cutting-edge instrument for nuclear physics experiments. The Gamma Ray Energy Tracking Array (GRETA) is currently under construction at Berkeley Lab and is scheduled to be installed in 2026 at the Facility for Rare Isotope Beams (FRIB) at Michigan State University.

The software will enable GRETA to stream data directly to the nation’s leading computing centers with the goal of analyzing large datasets in seconds. The data will be sent via the Energy Sciences Network, or ESnet. This will allow researchers to make critical adjustments to the experiment as it is taking place, leading to increased scientific productivity with significantly faster, more accurate results.
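The DELERIA software itself is not documented here, so the following is only a hypothetical sketch of the general pattern the article describes: a detector process streaming events over a network connection to a remote analyzer that produces a running summary while data is still arriving. The host, port, event format, and function names are all invented for illustration and are not GRETA's or DELERIA's actual interfaces.

```python
# Hypothetical sketch only: NOT the DELERIA API. It illustrates the general
# pattern of streaming detector events to a remote analyzer that keeps a
# running summary while data is still arriving.
import json
import socket
import struct
import threading

HOST, PORT = "127.0.0.1", 5555   # stand-in for an ESnet-reachable analysis endpoint
N_EVENTS = 1_000
ready = threading.Event()

def recv_exact(conn: socket.socket, n: int) -> bytes:
    """Read exactly n bytes, or b'' if the sender closed the connection."""
    buf = b""
    while len(buf) < n:
        chunk = conn.recv(n - len(buf))
        if not chunk:
            return b""
        buf += chunk
    return buf

def analyzer(port: int) -> None:
    """Accept one detector connection and keep a running mean of event energies."""
    with socket.create_server(("", port)) as server:
        ready.set()
        conn, _ = server.accept()
        with conn:
            total, count = 0.0, 0
            while True:
                header = recv_exact(conn, 4)
                if not header:
                    break
                (length,) = struct.unpack("!I", header)
                event = json.loads(recv_exact(conn, length))
                total += event["energy_keV"]
                count += 1
            print(f"analyzed {count} events, mean energy {total / count:.1f} keV")

def detector(host: str, port: int) -> None:
    """Send length-prefixed JSON 'events' as they are produced."""
    with socket.create_connection((host, port)) as sock:
        for i in range(N_EVENTS):
            event = {"id": i, "energy_keV": 1173.2 + (i % 7) * 0.1}
            payload = json.dumps(event).encode()
            sock.sendall(struct.pack("!I", len(payload)) + payload)

if __name__ == "__main__":
    worker = threading.Thread(target=analyzer, args=(PORT,))
    worker.start()
    ready.wait()
    detector(HOST, PORT)
    worker.join()
```

The point of the pattern is that analysis begins as soon as the first events arrive, which is what would let experimenters adjust a run in progress rather than waiting hours or days for offline processing.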

Can AI speed up aspects of the scientific process? Microsoft appears to think so.

At the company’s Build 2025 conference on Monday, Microsoft announced Microsoft Discovery, a platform that taps agentic AI to “transform the [scientific] discovery process,” according to a press release provided to TechCrunch. Microsoft Discovery is “extensible,” Microsoft says, and can handle certain science-related workloads “end-to-end.”

“Microsoft Discovery is an enterprise agentic platform that helps accelerate research and discovery by transforming the entire discovery process with agentic AI — from scientific knowledge reasoning to hypothesis formulation, candidate generation, and simulation and analysis,” explains Microsoft in its release. “The platform enables scientists and researchers to collaborate with a team of specialized AI agents to help drive scientific outcomes with speed, scale, and accuracy using the latest innovations in AI and supercomputing.”
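For intuition only, here is a minimal sketch of the kind of staged, agent-style pipeline the release describes (knowledge reasoning, then hypothesis formulation, candidate generation, and simulation and analysis). Every function here is a hypothetical stand-in; none of this uses or represents Microsoft Discovery's actual APIs.

```python
# Hypothetical sketch of a staged "agent" pipeline, loosely mirroring the stages
# named in Microsoft's release. These are placeholder functions, not any real
# Microsoft Discovery interface; each stage just passes a dictionary along.
from typing import Callable

def reason_over_knowledge(topic: str) -> dict:
    # Stand-in for a literature/knowledge-reasoning agent.
    return {"topic": topic, "known_facts": ["fact A", "fact B"]}

def formulate_hypothesis(state: dict) -> dict:
    # Stand-in for a hypothesis-generation agent.
    return {**state, "hypothesis": f"X improves Y for {state['topic']}"}

def generate_candidates(state: dict) -> dict:
    # Stand-in for a candidate-generation agent (e.g., molecules or materials).
    return {**state, "candidates": ["candidate-1", "candidate-2", "candidate-3"]}

def simulate_and_analyze(state: dict) -> dict:
    # Stand-in for simulation plus analysis; here we just "score" candidates.
    scores = {c: len(c) % 5 for c in state["candidates"]}
    return {**state, "scores": scores, "best": max(scores, key=scores.get)}

PIPELINE: list[Callable[[dict], dict]] = [
    formulate_hypothesis,
    generate_candidates,
    simulate_and_analyze,
]

def run(topic: str) -> dict:
    state = reason_over_knowledge(topic)
    for stage in PIPELINE:
        state = stage(state)
    return state

if __name__ == "__main__":
    print(run("example research topic"))
```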

China has begun launching satellites for a giant computer network in space, according to the China Aerospace Science and Technology Corporation.

Newsweek contacted the company and the United States Space Force for comment.

Why It Matters

Space is an increasing frontier for competition between China and the United States. Putting a computer network in space marks a step change from using satellites for sensing and communications while leaving them dependent on their connections to Earth for data processing.

Quantum annealing is a specific type of quantum computing that uses quantum physics principles to find high-quality solutions to difficult optimization problems. Rather than requiring exact optimal solutions, the study focused on finding solutions within a small margin (about 1%) of the optimal value.

Many real-world problems don’t require exact solutions, making this approach practically relevant. For example, in determining which stocks to put into a mutual fund, it is often good enough to just beat a leading market index rather than beating every other stock portfolio.
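As a concrete illustration of the kind of approximate optimization an annealer targets, here is a minimal classical simulated-annealing sketch for a toy "pick k assets" problem. The returns, risk values, penalty weight, and cooling schedule are all invented for the example; no quantum hardware or vendor SDK is involved.

```python
# Classical simulated-annealing sketch for a toy binary asset-selection problem.
# A quantum annealer would attack a similar binary (QUBO/Ising-style) cost
# function; here all data and parameters are made up for illustration.
import math
import random

random.seed(0)

N_ASSETS = 12
K = 4                                   # how many assets to hold
returns = [random.uniform(0.02, 0.12) for _ in range(N_ASSETS)]
risk = [[random.uniform(0.0, 0.02) for _ in range(N_ASSETS)] for _ in range(N_ASSETS)]

def cost(x: list[int]) -> float:
    """Lower is better: negative return + pairwise risk + budget-constraint penalty."""
    ret = sum(r * xi for r, xi in zip(returns, x))
    cov = sum(risk[i][j] * x[i] * x[j] for i in range(N_ASSETS) for j in range(N_ASSETS))
    budget_penalty = 10.0 * (sum(x) - K) ** 2
    return -ret + cov + budget_penalty

def anneal(steps: int = 20_000, t0: float = 1.0, t1: float = 1e-3) -> list[int]:
    x = [0] * N_ASSETS
    best, best_cost = x[:], cost(x)
    current_cost = best_cost
    for step in range(steps):
        t = t0 * (t1 / t0) ** (step / steps)       # geometric cooling schedule
        i = random.randrange(N_ASSETS)
        x[i] ^= 1                                   # flip one asset in/out
        new_cost = cost(x)
        if new_cost <= current_cost or random.random() < math.exp((current_cost - new_cost) / t):
            current_cost = new_cost                 # accept the move
            if new_cost < best_cost:
                best, best_cost = x[:], new_cost
        else:
            x[i] ^= 1                               # reject: undo the flip
    return best

if __name__ == "__main__":
    picks = anneal()
    print("selected assets:", [i for i, xi in enumerate(picks) if xi])
```

The design point mirrors the study's framing: the search is allowed to stop at a near-optimal selection rather than proving global optimality, which for many practical problems is all that is needed.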

Astronomers have developed a computer simulation to explore, in unprecedented detail, magnetism and turbulence in the interstellar medium (ISM)—the vast ocean of gas and charged particles that lies between stars in the Milky Way galaxy.

Described in a study published in Nature Astronomy, the model is the most powerful to date, requiring the computing capability of the SuperMUC-NG supercomputer at the Leibniz Supercomputing Center in Germany. It directly challenges our understanding of how magnetized turbulence operates in astrophysical environments.

James Beattie, the paper’s lead author and a postdoctoral researcher at the Canadian Institute for Theoretical Astrophysics (CITA) at the University of Toronto, is hopeful the model will provide new insights into the ISM, the magnetism of the Milky Way galaxy as a whole, and astrophysical phenomena such as star formation and the propagation of cosmic rays.

Tesla is developing a terawatt-level supercomputer at Giga Texas to enhance its self-driving technology and AI capabilities, positioning the company as a leader in the automotive and renewable energy sectors despite current challenges.

Questions to inspire discussion.

Tesla’s Supercomputers.

💡 Q: What is the scale of Tesla’s new supercomputer project?

A: Tesla’s Cortex 2 supercomputer at Giga Texas aims for 1 terawatt of compute with 1.4 billion GPUs, making it 3,300x bigger than today’s top system.

💡 Q: How does Tesla’s compute power compare to Chinese competitors?

A: Tesla’s Full Self-Driving (FSD) uses 3x more compute than Huawei, Xpeng, Xiaomi, and Li Auto combined, with BYD not yet a significant competitor.

XAI’s Colossus supercomputer is set to revolutionize AI technology and significantly enhance Tesla’s capabilities in self-driving, energy reliability, and factory operations through its rapid expansion and innovative partnerships.

Questions to inspire discussion.

AI Supercomputing.

🖥️ Q: What is XAI’s Colossus data center’s current capacity?

A: XAI’s Colossus data center is now fully operational for Phase 1 with 300,000 H100 equivalents, powered by 150 MW from the grid and 150 MW in Tesla Megapacks.
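As a rough back-of-envelope check on those figures, using only the numbers quoted above, the combined 300 MW spread across 300,000 H100-equivalent GPUs works out to about 1 kW per GPU, which is consistent with an H100's roughly 700 W rating plus cooling and other facility overhead:

```python
# Back-of-envelope check using only the figures quoted above.
grid_mw = 150
megapack_mw = 150
gpus = 300_000          # H100 equivalents

total_w = (grid_mw + megapack_mw) * 1_000_000
per_gpu_w = total_w / gpus
print(f"{per_gpu_w:.0f} W available per GPU, including cooling and other overhead")
# -> 1000 W per GPU
```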

Computer simulations help materials scientists and biochemists study the motion of macromolecules, advancing the development of new drugs and sustainable materials. However, these simulations pose a challenge for even the most powerful supercomputers.

A University of Oregon graduate student has developed a new mathematical equation that significantly improves the accuracy of the simplified computer models used to study the motion and behavior of large molecules such as proteins, and synthetic materials such as plastics.

The breakthrough, published last month in Physical Review Letters, enhances researchers’ ability to investigate the motion of large molecules in complex biological processes, such as DNA replication. It could aid in understanding diseases linked to errors in such replication, potentially leading to new diagnostic and therapeutic strategies.
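The article does not reproduce the new equation itself, but for context, here is a minimal sketch of the kind of simplified (coarse-grained) model it improves upon: an overdamped Langevin (Brownian dynamics) simulation of a bead-spring polymer chain. Every parameter is invented for illustration, and this is not the method from the Physical Review Letters paper.

```python
# Minimal overdamped Langevin sketch of a bead-spring chain, the kind of
# simplified coarse-grained model the article refers to. All parameters are
# illustrative; this is not the new equation from the paper.
import math
import random

random.seed(1)

N_BEADS = 10          # beads standing in for groups of atoms
K_SPRING = 5.0        # harmonic bond stiffness (arbitrary units)
GAMMA = 1.0           # friction coefficient
KT = 1.0              # thermal energy
DT = 1e-3             # time step
STEPS = 10_000

# Start the chain stretched along x; positions are [x, y, z] lists.
pos = [[float(i), 0.0, 0.0] for i in range(N_BEADS)]

def bond_forces(p):
    """Harmonic forces between neighboring beads (rest length 1)."""
    f = [[0.0, 0.0, 0.0] for _ in p]
    for i in range(len(p) - 1):
        dx = [p[i + 1][k] - p[i][k] for k in range(3)]
        r = math.sqrt(sum(d * d for d in dx))
        mag = K_SPRING * (r - 1.0) / r
        for k in range(3):
            f[i][k] += mag * dx[k]
            f[i + 1][k] -= mag * dx[k]
    return f

noise = math.sqrt(2.0 * KT * DT / GAMMA)   # fluctuation-dissipation noise amplitude
for _ in range(STEPS):
    f = bond_forces(pos)
    for i in range(N_BEADS):
        for k in range(3):
            pos[i][k] += f[i][k] * DT / GAMMA + noise * random.gauss(0.0, 1.0)

end_to_end = math.sqrt(sum((pos[-1][k] - pos[0][k]) ** 2 for k in range(3)))
print(f"end-to-end distance after {STEPS} steps: {end_to_end:.2f}")
```

The chain is deliberately crude: each bead stands in for many atoms, which is exactly the simplification that makes such models fast but approximate, and it is the accuracy of this kind of reduced description that the new equation is reported to improve.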