Whenever we mull over what film to watch on Netflix, or deliberate between products on an e-commerce platform, the gears of recommendation algorithms spin under the hood. These systems sort through sprawling datasets to deliver personalized suggestions. However, as data becomes richer and more interconnected, today’s algorithms struggle to capture relationships that extend beyond simple pairs, such as group ratings, cross-category tags, or interactions shaped by time and context.
A team of researchers led by Professor Kavan Modi from the Singapore University of Technology and Design (SUTD) has taken a conceptual leap into this complexity by developing a new quantum framework for analyzing higher-order network data.
Their work centers on a mathematical field called topological signal processing (TSP), which encodes not only connections between pairs of points but also relationships among triplets, quadruplets, and beyond. Here, “signals” are information that lives on higher-dimensional shapes (triangles or tetrahedra) embedded in a network.
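The article does not detail the team’s quantum encoding, but the classical workhorse of TSP is the Hodge Laplacian, assembled from incidence matrices that record which edges touch which nodes and which filled triangles. The sketch below is a minimal classical illustration, with a toy complex and invented signal values, not the SUTD construction:

```python
import numpy as np

# Minimal classical sketch of topological signal processing (TSP).
# Toy complex and signal values are invented for illustration; this is
# not the SUTD team's quantum framework.

nodes = [0, 1, 2, 3]
edges = [(0, 1), (0, 2), (1, 2), (1, 3)]   # 1-simplices
triangles = [(0, 1, 2)]                    # 2-simplex (a filled triangle)

# B1: node-to-edge incidence matrix (each edge oriented low -> high).
B1 = np.zeros((len(nodes), len(edges)))
for j, (u, v) in enumerate(edges):
    B1[u, j] = -1.0
    B1[v, j] = +1.0

# B2: edge-to-triangle incidence matrix with consistent orientations,
# following the boundary rule [a,b,c] -> [b,c] - [a,c] + [a,b].
B2 = np.zeros((len(edges), len(triangles)))
for k, (a, b, c) in enumerate(triangles):
    for edge, sign in [((a, b), +1.0), ((b, c), +1.0), ((a, c), -1.0)]:
        B2[edges.index(edge), k] = sign

# Hodge 1-Laplacian: couples an edge signal to its endpoint nodes
# (via B1) and to the triangles it borders (via B2).
L1 = B1.T @ B1 + B2 @ B2.T

# An edge "signal", e.g. pairwise interaction strengths; the quadratic
# form s^T L1 s measures its smoothness over the complex.
s = np.array([1.0, 0.5, -0.5, 2.0])
print("Hodge smoothness of the edge signal:", s @ L1 @ s)
```

The quadratic form s^T L1 s is the higher-order analogue of the graph-Laplacian smoothness used in ordinary network signal processing: it penalizes edge signals that disagree across shared nodes or shared triangles.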
Neural networks revolutionized machine learning for classical computers, making possible self-driving cars, language translation, and much of modern artificial intelligence software. It is no wonder, then, that researchers wanted to transfer this same power to quantum computers, but every attempt to do so brought unforeseen problems.
Recently, however, a team at Los Alamos National Laboratory developed a new way to bring these same mathematical concepts to quantum computers by leveraging something called the Gaussian process.
“Our goal for this project was to see if we could prove that genuine quantum Gaussian processes exist,” said Marco Cerezo, the Los Alamos team’s lead scientist. “Such a result would spur innovations and new forms of performing quantum machine learning.”
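For context on the classical object being generalized: a Gaussian process is a distribution over functions, pinned down entirely by a mean and a covariance kernel, and conditioning it on data yields closed-form predictions. The sketch below is purely classical GP regression with invented toy data; it does not reproduce the Los Alamos team’s quantum construction:

```python
import numpy as np

# Minimal classical Gaussian-process regression, to illustrate the
# mathematical object being generalized. Kernel choice, data, and
# hyperparameters are invented for illustration.

def rbf_kernel(X1, X2, length=1.0):
    """Squared-exponential kernel k(x, x') = exp(-(x - x')^2 / (2 l^2))."""
    d = X1[:, None] - X2[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

X_train = np.array([-2.0, 0.0, 1.5])       # toy observations
y_train = np.sin(X_train)
X_test = np.linspace(-3, 3, 7)

jitter = 1e-6                               # numerical stabilizer
K = rbf_kernel(X_train, X_train) + jitter * np.eye(len(X_train))
K_star = rbf_kernel(X_test, X_train)

# Posterior mean and covariance of the GP conditioned on the data.
alpha = np.linalg.solve(K, y_train)
mean = K_star @ alpha
cov = rbf_kernel(X_test, X_test) - K_star @ np.linalg.solve(K, K_star.T)

print("posterior mean:", np.round(mean, 3))
print("posterior std :", np.round(np.sqrt(np.clip(np.diag(cov), 0, None)), 3))
```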
After finding the homeschooling life confining, the teen petitioned her way into a graduate class at Berkeley, where she ended up disproving a 40-year-old conjecture.
A new study published in July 2025 tackles one of science’s most profound mysteries: how did life first emerge from nonliving matter on early Earth? Using cutting-edge mathematical approaches, researcher Robert G. Endres from Imperial College London has developed a framework suggesting that the spontaneous origin of life faces far greater challenges than previously understood.
Joint research led by Sosuke Ito of the University of Tokyo has shown that nonequilibrium thermodynamics, the branch of physics that deals with constantly changing systems, explains why optimal transport theory, a mathematical framework for reshaping one probability distribution into another at minimal cost, makes generative models optimal. Because nonequilibrium thermodynamics has yet to be fully leveraged in designing generative models, the discovery offers a novel thermodynamic approach to machine learning research. The findings were published in the journal Physical Review X.
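As a hedged illustration of the transport cost involved (not the paper’s thermodynamic analysis), in one dimension the optimal way to move one set of samples onto another under squared-distance cost is simply to match sorted points:

```python
import numpy as np

# Illustrative 1D optimal transport: for two equal-size empirical
# distributions, the squared 2-Wasserstein cost reduces to matching
# order statistics. Distributions here are invented stand-ins.

rng = np.random.default_rng(0)
source = rng.normal(0.0, 1.0, size=1000)   # e.g. a noise distribution
target = rng.normal(3.0, 0.5, size=1000)   # e.g. a data distribution

# Optimal coupling in 1D: the i-th smallest source point moves to the
# i-th smallest target point; cost is the mean squared displacement.
w2_squared = np.mean((np.sort(source) - np.sort(target)) ** 2)
print("squared 2-Wasserstein cost:", round(w2_squared, 3))
```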
Image generation has been improving in leaps and bounds over recent years: a video of a celebrity eating a bowl of spaghetti that represented the state of the art a couple of years ago would not even qualify as good today. The algorithms that power image generation are called diffusion models, and they contain randomness called “noise.”
During the training process, noise is introduced to the original data through diffusion dynamics. During the generation process, the model must eliminate the noise to generate new content from the noisy data. This is achieved by considering the time-reversed dynamics, as if playing the video in reverse. One piece of the art and science of building a model that produces high-quality content is specifying when and how much noise is added to the data.
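A minimal sketch of that forward noising step, assuming the common DDPM-style variance-preserving scheme with a linear noise schedule (the study’s thermodynamically optimal schedule is not reproduced here, and all parameters are illustrative defaults):

```python
import numpy as np

# Forward (noising) step of a DDPM-style diffusion model.
# The schedule below is the standard linear one, chosen for
# illustration; it is not the schedule from the cited study.

T = 1000
betas = np.linspace(1e-4, 0.02, T)      # "when and how much" noise
alpha_bar = np.cumprod(1.0 - betas)     # cumulative signal retention

def noise_sample(x0, t, rng):
    """q(x_t | x_0): mix clean data x0 with Gaussian noise at step t."""
    eps = rng.standard_normal(x0.shape)
    xt = np.sqrt(alpha_bar[t]) * x0 + np.sqrt(1.0 - alpha_bar[t]) * eps
    return xt, eps                      # eps is the training target

rng = np.random.default_rng(0)
x0 = rng.standard_normal(8)             # stand-in for an image
xt, eps = noise_sample(x0, t=500, rng=rng)
# A network would be trained to predict eps from (xt, t); generation
# then runs the dynamics in reverse, stripping noise step by step.
print("signal fraction at t=500:", round(float(np.sqrt(alpha_bar[500])), 3))
```

The arrays betas and alpha_bar are exactly the “when and how much” knobs the passage describes; the cited work asks which choice of such dynamics is optimal in a thermodynamic sense.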
The time delay experienced by a scattered light signal has an imaginary part that was considered unobservable, but researchers have isolated its effect in a frequency shift.
A scattering material, such as a frosted window or a thin fog, will cause light to travel slower than it would if no material were in its path. The mathematical formula for this time delay has a real part—which is well studied—and a lesser-known imaginary part. “The imaginary time delay has been largely ignored and disregarded as unphysical,” says Isabella Giovannelli from the University of Maryland. But she and her advisor Steven Anlage have now measured this abstract quantity by recording a corresponding frequency shift in scattered light pulses [1].
The real part of the time delay has been observed in many experiments, particularly slow-light setups where light pulses can become effectively trapped inside a scattering medium (see Focus: Light Nearly Stopped in a Waveguide). By contrast, the imaginary part has been stuck in the realm of mathematics. Theoretical work from 2016, however, showed that the imaginary time delay can be related to a potentially observable frequency shift [2].
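In standard notation (consistent with, though not quoted from, Refs. [1, 2]), the complex time delay for a scattering amplitude S(ω) = |S(ω)|e^{iφ(ω)} can be written as

```latex
% Complex (Wigner-Smith-type) time delay; a standard textbook form,
% given here as context rather than as the papers' exact expressions.
\[
  \tau(\omega)
  \;=\; -\,i\,\frac{\partial \ln S(\omega)}{\partial \omega}
  \;=\; \frac{\partial \phi}{\partial \omega}
        \;-\; i\,\frac{\partial \ln |S(\omega)|}{\partial \omega} .
\]
```

The real part, ∂φ/∂ω, is the familiar group delay probed in slow-light experiments. The imaginary part tracks how the magnitude of the scattered signal changes with frequency: loosely, frequencies that scatter more strongly are reweighted in the outgoing pulse, shifting its spectral centroid rather than delaying its arrival.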
Using mathematical analysis of patterns of human and animal cell behavior, scientists say they have developed a computer program that mimics the behavior of such cells in any part of the body. Led by investigators at Indiana University, Johns Hopkins Medicine, the University of Maryland School of Medicine and Oregon Health & Science University, the new work was designed to advance ways of testing and predicting biological processes, drug responses and other cell dynamics before undertaking more costly experiments with live cells.
With further work on the program, the researchers say it could eventually serve as a “digital twin” for testing any drug’s effect on cancer or other conditions, gene-environment interactions during brain development, or any number of dynamic cellular and molecular processes in people where such studies are not otherwise possible.
Funded primarily by the Jayne Koskinas Ted Giovanis Foundation and the National Institutes of Health, and leveraging prior knowledge and data funded by the Lustgarten Foundation and National Foundation for Cancer Research, the new study and examples of cell simulations are described online July 25 in the journal Cell.
Jason is Chief Executive Officer and Chairman of the Board of BioStem Technologies (https://biostemtechnologies.com/), a leading innovator focused on harnessing the natural properties of perinatal tissue in the development, manufacture, and commercialization of allografts for regenerative therapies.
Jason brings a wealth of experience in strategic operations planning and technical project management from his rigorous technical background. His diverse expertise includes continuous process improvement, training and development programs, regulatory compliance and best-practices implementation, and advanced problem solving.
Jason began his career as a technical engineer working for Adecco at SC Johnson in 2009, where he developed comprehensive maintenance plans to support manufacturing processes at scale. He then transitioned to manufacturing and quality engineering for major organizations, including ATI Ladish Forging, Nemak, and HUSCO International, where he spearheaded process design and implementation, solved complex supply-chain and manufacturing problems, and improved product sourcing and purchasing.
Jason’s philanthropic work with the Juvenile Diabetes Research Foundation sparked an interest in biotech, leading him to co-found BioStem Technologies in 2014. As CEO, he has leveraged his expertise to optimize tissue sourcing, strategically build out a 6,000-square-foot tissue processing facility that is fully compliant with FDA 21 CFR Parts 210, 211, and 1271 and with AATB standards, and assemble an expert team of professionals to support the company’s continued growth.
Jason holds a B.S. in Mechanical Engineering Technology with a minor in Mathematics from the Milwaukee School of Engineering and is Six Sigma Black Belt certified. He also serves as a Processing and Distribution Council member for the American Association of Tissue Banks (AATB) and as a member of the Government Affairs committee for BioFlorida.