
The feature image you see above was generated by Stable Diffusion, an AI text-to-image model that typically runs in the cloud via a web browser, driven by data center servers with big power budgets and a ton of silicon horsepower. This time, however, Stable Diffusion ran entirely on a smartphone in airplane mode, with no connectivity to any cloud data center whatsoever. The model was powered by a Qualcomm Snapdragon 8 Gen 2 mobile chip, on a device that operates at under 7 watts or so.

It took Stable Diffusion only a few short phrases and 14.47 seconds to render this image.


This is an example of a 540p input image being upscaled to 4K resolution, which results in much cleaner lines, sharper textures, and a better overall experience. Qualcomm already ships a non-AI version of this technique today, called Snapdragon GSR, but someday mobile enthusiast gamers will be treated to even better image quality, at even higher frame rates, without sacrificing battery life.
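To see what an upscaler has to accomplish, consider the naive baseline: nearest-neighbor upscaling, where each source pixel is simply copied into a block of output pixels. This toy sketch (not Snapdragon GSR, which uses a far more sophisticated single-pass filter, nor an ML super-resolution model) shows why naive scaling produces blocky results that smarter techniques improve on:

```python
def upscale_nearest(img, factor):
    """Nearest-neighbour upscaling: each source pixel becomes a
    factor x factor block in the output.  Real upscalers (bicubic,
    Snapdragon GSR, ML super-resolution) reconstruct edges and
    texture far better; this is the baseline they improve on."""
    return [[row[x // factor] for x in range(len(row) * factor)]
            for row in img
            for _ in range(factor)]

# A tiny 2x2 "image"; 540p -> 2160p (4K) corresponds to factor=4.
tiny = [[0, 255],
        [255, 0]]
big = upscale_nearest(tiny, 2)
print(big)  # → [[0, 0, 255, 255], [0, 0, 255, 255], [255, 255, 0, 0], [255, 255, 0, 0]]
```

Every output block is a flat copy of one input pixel, which is exactly the "jagged edges" artifact that learned upscalers are trained to avoid.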

This is just one example of gaming and media enhancement with pre-trained, quantized machine learning models, but it is easy to imagine a myriad of applications that could benefit greatly, from recommendation engines to location-aware guidance to computational photography and more.
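Quantization is what makes these models fit in a phone's power budget: weights trained in 32-bit floating point are mapped to 8-bit integers, shrinking memory traffic and enabling fast integer math. The sketch below illustrates the basic idea with a simple symmetric scheme in plain Python; it is a minimal illustration, not Qualcomm's actual quantization pipeline:

```python
def quantize_int8(weights):
    """Symmetric post-training quantization: map float weights to
    the int8 range [-127, 127] using a single scale factor, and
    return both the quantized values and the scale needed to
    recover approximate floats later."""
    scale = max(abs(w) for w in weights) / 127.0
    q = [max(-127, min(127, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from int8 values."""
    return [qi * scale for qi in q]

w = [0.42, -1.27, 0.05, 0.9, -0.33]
q, s = quantize_int8(w)
w_hat = dequantize(q, s)
print(q)  # → [42, -127, 5, 90, -33]
# Round-off error is bounded by half the scale step.
print(max(abs(a - b) for a, b in zip(w, w_hat)) <= s / 2)  # → True
```

Each weight now occupies one byte instead of four, at the cost of a bounded rounding error; production toolchains refine this with per-channel scales and calibration data.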

We just needed new math for all this AI heavy lifting on smartphones and other low-power edge devices, and it appears Qualcomm is leading that charge.

The exotic particles are called non-Abelian anyons, or nonabelions for short, and their Borromean rings exist only as information inside the quantum computer. But their linking properties could help to make quantum computers less error-prone, or more ‘fault-tolerant’ — a key step to making them outperform even the best conventional computers. The results, revealed in a preprint on 9 May, were obtained on a machine at Quantinuum, a quantum-computing company in Broomfield, Colorado, formed by the merger of Honeywell’s quantum-computing unit and a start-up based in Cambridge, UK.

“This is the credible path to fault-tolerant quantum computing,” says Tony Uttley, Quantinuum’s president and chief operating officer.

Other researchers are less optimistic about the virtual nonabelions’ potential to revolutionize quantum computing, but creating them is seen as an achievement in itself. “There is enormous mathematical beauty in this type of physical system, and it’s incredible to see them realized for the first time, after a long time,” says Steven Simon, a theoretical physicist at the University of Oxford, UK.

Mathematicians have uncovered a universal explanatory framework that provides a “window into evolution.” This framework explains how molecules interact with each other in adapting to changing conditions while still maintaining tight control over essential properties that are crucial for survival.

According to Dr. Araujo from the QUT School of Mathematical Sciences, the research results provide a blueprint for the creation of signaling networks that are capable of adapting across all life forms and for the design of synthetic biological systems.

“Our study considers a process called robust perfect adaptation (RPA) whereby biological systems, from individual cells to entire organisms, maintain important molecules within narrow concentration ranges despite continually being bombarded with disturbances to the system,” Dr. Araujo said.
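One classic mechanism known to produce robust perfect adaptation is integral feedback: a second molecular species accumulates the error between the regulated output and its set point, and feeds back until that error is zero. The simulation below is a generic textbook illustration of that principle, not the specific framework from Dr. Araujo's study:

```python
def simulate(disturbance, y_set=1.0, dt=0.01, t_end=60.0):
    """Euler simulation of a two-species integral-feedback motif.
    y is the regulated output; z accumulates the error (y - y_set)
    and feeds back on y.  At steady state dz/dt = 0 forces
    y = y_set for ANY constant disturbance -- the hallmark of
    robust perfect adaptation."""
    y, z = y_set, 0.0
    for _ in range(int(t_end / dt)):
        dy = disturbance - z - y   # disturbed production, feedback, decay
        dz = y - y_set             # integral of the regulation error
        y += dt * dy
        z += dt * dz
    return y

# The output returns to its set point no matter how large the
# constant disturbance is; only the transient differs.
for d in (0.5, 2.0, 10.0):
    print(f"disturbance={d:5.1f} -> steady-state y={simulate(d):.4f}")
```

However the system is perturbed, the integrator keeps pushing until the output is back at its set point, which is exactly the "narrow concentration range despite continual disturbances" behavior the quote describes.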

A new study published in Human Brain Mapping revealed that long-term musical training can modify the connectivity networks in the brain’s white matter.

Previous research has shown that intense musical training induces structural neuroplasticity in different brain regions. However, previous studies mainly investigated brain changes in instrumental musicians, and little is known about how structural connectivity in non-instrumental musicians is affected by long-term training.

To examine how the connections between different parts of the brain might be affected by long-term vocal training, the study's researchers used graph theory and diffusion-weighted imaging. Graph theory is a mathematical framework used to study the architecture of networks in the human brain, while diffusion-weighted imaging is an MRI technique that measures the diffusion of water molecules in tissues, providing information on the structural connectivity of the brain.
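In graph-theoretic connectivity analyses, brain regions become nodes and white-matter connections become edges of an adjacency matrix, from which summary metrics are computed. The sketch below shows one standard metric, characteristic path length (the mean shortest-path distance between node pairs), on a toy binary network; it is a generic illustration, not the study's actual pipeline:

```python
from collections import deque

def shortest_path_lengths(adj, source):
    """Breadth-first search over a binary adjacency matrix
    (list of lists); returns hop counts from source to every
    reachable node."""
    n = len(adj)
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in range(n):
            if adj[u][v] and v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def characteristic_path_length(adj):
    """Mean shortest-path length over all connected node pairs --
    one of the standard graph-theory summaries of a connectome."""
    n = len(adj)
    total, pairs = 0, 0
    for s in range(n):
        for v, d in shortest_path_lengths(adj, s).items():
            if v != s:
                total += d
                pairs += 1
    return total / pairs

# Toy 4-region "connectome": a ring network.
ring = [[0, 1, 0, 1],
        [1, 0, 1, 0],
        [0, 1, 0, 1],
        [1, 0, 1, 0]]
print(characteristic_path_length(ring))  # each node: two neighbors at 1, one at 2
```

In real studies the adjacency matrix is weighted by fiber counts derived from the diffusion images, and metrics like this are compared between trained and untrained groups.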

Contents:
- Defining computational neuroscience
- The evolution of computational neuroscience
- Computational neuroscience in the twenty-first century
- Some examples of computational neuroscience
- The SpiNNaker supercomputer
- Frontiers in computational neuroscience
- References
- Further reading

The human brain is a complex and unfathomable supercomputer. How it works is one of the ultimate mysteries of our time. Scientists working in the exciting field of computational neuroscience seek to unravel this mystery and, in the process, help solve problems in diverse research fields, from Artificial Intelligence (AI) to psychiatry.

Computational neuroscience is a highly interdisciplinary and thriving branch of neuroscience that uses computational simulations and mathematical models to develop our understanding of the brain. Here we look at: what computational neuroscience is, how it has grown over the last thirty years, what its applications are, and where it is going.

Worms can entangle themselves into a single, giant knot, only to quickly unravel themselves from the tightly wound mess within milliseconds. Now, math shows how they do it.

Researchers studied California blackworms (Lumbriculus variegatus) — thin worms that can grow to be 4 inches (10 centimeters) in length — in the lab, watching as the worms intertwined by the thousands. Even though it took the worms minutes to form into a ball-shaped blob akin to a snarled tangle of Christmas lights, they could untangle from the jumble in the blink of an eye when threatened, according to a study published April 28 in the journal Science.

It was Arthur C. Clarke who famously said that “Any sufficiently advanced technology is indistinguishable from magic” (although I’d argue that Jack Kirby and Jim Starlin rather perfected the idea). Now, a group of real-life scientists at the RIKEN Interdisciplinary Theoretical and Mathematical Sciences program in Japan have taken it a step further, identifying a new quantum property that measures the weirdness of spacetime and officially calling it “magic.” From the scientific paper “Probing chaos by magic monotones,” recently published in the journal Physical Review D: