New computational biology for genome sequencing analysis

In a single experiment, scientists can decipher the entire genomes of many patient samples, animal models or cultured cells. To fully realize the potential of studying biology at this unprecedented scale, researchers must be equipped to analyze the titanic troves of data generated by these new methods.

Scientists have published findings in Cell Reports Methods describing the development and testing of a new computational tool for tackling massive and complex sequencing datasets. The new resource, named metapipeline-DNA, may also make sequencing data analysis more standardized across different research labs.

The sequence of a single human genome represents about 100 gigabytes of raw data, the rough equivalent of 20,000 smartphone photos. The sheer scale of experimental data increases significantly as tens or hundreds of genomes are added to the mix.

To improve metapipeline-DNA’s ability to determine where changes in the genome have occurred, the scientists worked with the Genome in a Bottle Consortium, led by the U.S. Department of Commerce’s National Institute of Standards and Technology. By incorporating this public-private-academic consortium’s meticulously validated resources, the researchers reduced the rate of false positives without compromising the tool’s ability to find true genetic variants.

The researchers also produced two case studies demonstrating the pipeline’s capabilities for cancer research. The investigators used metapipeline-DNA to analyze sequencing data from five patients who donated both normal tissue and tumor samples, as well as data from another five patients in The Cancer Genome Atlas.

The next step is to get metapipeline-DNA into more labs to accelerate discoveries, and to continue improving the resource with user feedback.

Ultrastructural preservation of a whole large mammal brain with a protocol compatible with human physician-assisted death

Ultrastructural Preservation of a Whole Large Mammal Brain (bioRxiv, 2026) ⚠️ Preprint – not yet peer-reviewed.

A 2026 preprint builds on over a decade of brain preservation research, demonstrating that whole mammalian brains (pigs) can be preserved with remarkable structural fidelity under near–real-world, end-of-life conditions.

The study refines aldehyde-stabilized cryopreservation (ASC), a technique previously recognized by the Brain Preservation Foundation. This method combines chemical fixation (aldehydes), cryoprotectants, and controlled cooling to prevent ice damage and preserve neural structure at the nanoscale.

What the study shows:

Whole pig brains preserved with intact cellular and synaptic architecture.

Preservation remains viable even with delayed postmortem intervals (~10 minutes).

Tissue remains perfusable and structurally stable after fixation.

Protocol moves toward clinically realistic implementation, not just lab conditions.

Physicists just turned glass into a powerful quantum security device

Scientists have turned simple glass into a powerful quantum communication device that could safeguard data against future quantum attacks. The chip combines stability, speed, and versatility—handling both ultra-secure encryption and record-breaking random number generation in one compact system.

Electric current stabilizes spins at unstable points for new types of computing

A research team has discovered a new way to control tiny magnetic properties inside materials using electric current, which could pave the way for new types of computing technologies. The work is based on spintronics, a field that uses not only the electric charge of electrons but also their “spin,” a quantum property that can be thought of as a tiny magnet.

Spintronics is already used in magnetic random access memory (MRAM), a type of memory that keeps data even when the power is turned off. This is different from conventional memory, which loses information without electricity.

In MRAM, data is stored depending on whether spins point “up” or “down.” These two stable states are separated by an energy barrier, which keeps the stored data from being lost. However, this stability also makes it harder to switch between states, requiring strong electric currents.
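To make the energy-barrier picture concrete, here is a minimal sketch assuming a standard single-domain (macrospin) model with uniaxial anisotropy; the material constants and dimensions are illustrative placeholders, not values from the study.

```python
import numpy as np

# Macrospin illustration (not from the study): a free-layer magnet with uniaxial
# anisotropy has energy E(theta) = K*V*sin^2(theta), where theta is the angle
# between the spin and the easy axis. theta = 0 ("up") and theta = pi ("down")
# are the two stable states; theta = pi/2 is the top of the barrier.

k_B = 1.380649e-23            # Boltzmann constant, J/K
K   = 2.0e5                   # illustrative anisotropy energy density, J/m^3
V   = (40e-9) ** 2 * 1.5e-9   # illustrative free layer: 40 nm x 40 nm x 1.5 nm
T   = 300.0                   # temperature, K

theta = np.linspace(0, np.pi, 181)
energy = K * V * np.sin(theta) ** 2   # energy profile between the two wells

barrier = energy.max()                # = K*V, reached at theta = pi/2
delta = barrier / (k_B * T)           # thermal stability factor

print(f"Energy barrier: {barrier:.2e} J (~{delta:.0f} k_B*T at 300 K)")
```

With these illustrative numbers the barrier is on the order of a hundred k_B*T at room temperature, which is the kind of margin that keeps stored bits from flipping thermally and is exactly why deliberately switching them requires a strong current.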

A Hall ‘rectenna’ can detect signals over a 100 GHz frequency range

Many current wireless communication, imaging and sensing technologies rely on components that convert oscillating electric and magnetic fields (i.e., electromagnetic waves) into electrical signals. Among the most widely used components are so-called p-n diodes, semiconductor devices that combine two types of materials with distinct electrical properties.

In conventional diode designs, the conversion of electromagnetic waves into electrical signals relies on the nonlinear transport of electrons. This means that the electric current in the devices does not change proportionally with the voltage applied, which allows them to rectify signals (i.e., convert alternating current into direct current) and combine signals with different frequencies.
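As a rough illustration of how nonlinearity enables rectification, the sketch below uses the generic Shockley diode equation, not a model of the Hall rectenna itself; all values are illustrative.

```python
import numpy as np

# The ideal-diode (Shockley) equation I = I_s * (exp(V / (n*V_T)) - 1) is
# nonlinear in V, so a purely AC drive with zero mean produces a nonzero
# average (DC) current -- the essence of rectification.

I_s = 1e-12      # saturation current, A (illustrative)
n   = 1.0        # ideality factor
V_T = 0.02585    # thermal voltage k_B*T/q at ~300 K, V

t = np.linspace(0, 1e-9, 10_000)             # 1 ns window (10 periods)
v_ac = 0.05 * np.sin(2 * np.pi * 10e9 * t)   # 50 mV sine at 10 GHz, zero mean

i = I_s * (np.exp(v_ac / (n * V_T)) - 1.0)   # nonlinear current response

print(f"mean drive voltage : {v_ac.mean():+.2e} V")   # ~0: input carries no DC
print(f"mean diode current : {i.mean():+.2e} A")      # > 0: rectified DC output
```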

A key limitation of traditional diodes is that thermal effects introduce noise, causing electrons to move randomly and making weak signals harder to detect. Moreover, electrons typically take a finite time to travel across the device, also known as the transit time, which limits the performance of the diodes at very high frequencies.
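Two back-of-the-envelope formulas capture these limits: the Johnson-Nyquist thermal noise floor and a transit-time frequency cutoff. The numbers below are illustrative, not parameters of the new device.

```python
import numpy as np

k_B = 1.380649e-23   # Boltzmann constant, J/K
T   = 300.0          # temperature, K
R   = 50.0           # ohm, a typical RF source impedance
B   = 1e9            # 1 GHz detection bandwidth

# Johnson-Nyquist (thermal) noise sets a floor below which weak signals are lost.
v_noise_rms = np.sqrt(4 * k_B * T * R * B)
print(f"thermal noise over 1 GHz bandwidth: {v_noise_rms * 1e6:.1f} uV rms")

# A finite electron transit time tau limits the response at high frequencies,
# roughly rolling off above f_max ~ 1 / (2*pi*tau).
tau = 2e-12                        # 2 ps transit time (illustrative)
f_max = 1 / (2 * np.pi * tau)
print(f"transit-time-limited cutoff: {f_max / 1e9:.0f} GHz")
```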

Superconducting quantum processor performs well with significantly less wiring

Quantum computers, computing systems that process information using quantum mechanical effects, could outperform classical computers on some computational tasks. These computers rely on qubits, the basic units of quantum information, which can exist in multiple states (0, 1 or both simultaneously) thanks to the quantum effect known as superposition, and which can be correlated with one another through entanglement.
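A minimal way to picture superposition is as a normalized two-component state vector; the sketch below is a generic illustration, not tied to the superconducting hardware described here.

```python
import numpy as np

# A single qubit is described by a normalized 2-vector over the basis |0>, |1>.
alpha, beta = 1 / np.sqrt(2), 1j / np.sqrt(2)   # equal superposition with a phase
psi = np.array([alpha, beta])

assert np.isclose(np.vdot(psi, psi).real, 1.0)  # normalization check

# Measurement probabilities are the squared magnitudes of the amplitudes.
p0, p1 = np.abs(psi[0]) ** 2, np.abs(psi[1]) ** 2
print(f"P(measure 0) = {p0:.2f}, P(measure 1) = {p1:.2f}")   # 0.50 each
```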

Many of the quantum computers developed in recent years are based on conventional superconductors, materials that exhibit an electrical resistance of zero at extremely low temperatures. To operate reliably and exhibit superconductivity, circuits based on these materials need to be cooled down to millikelvin temperatures.

In quantum computers, each qubit typically requires its own control line. This means that engineers need to introduce several wires that carry electrical pulses (i.e., signal lines), and the number of necessary wires increases with the number of qubits. As quantum computers grow larger, this can be problematic, as processors become harder to build and reliably operate.

Quantum computers could have a fundamental limit after all

The performance of quantum computers could cap out after around 1,000 qubits, according to a new analysis published in the Proceedings of the National Academy of Sciences. Through new calculations, Tim Palmer at the University of Oxford has reconsidered the mathematical foundations underlying the quantum principles behind the technology, concluding that restrictions on the information-carrying capacity of large quantum systems could make their computing power far more limited than many researchers predict.

For some time, quantum physicists have been growing increasingly excited—and concerned—about the seemingly limitless potential of quantum computers. In a classical computer, information content generally grows linearly as the number of bits increases. But in a quantum computer, each extra qubit doubles the number of quantum states the system can occupy.

Since these states can encode multiple possibilities at the same time, the overall system appears to become exponentially more powerful with each added qubit—at least according to our current understanding of quantum mechanics.
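The doubling is easy to see by counting amplitudes: an n-qubit pure state takes 2^n complex numbers to describe classically. The short sketch below is standard counting, not part of Palmer's analysis.

```python
# Each added qubit doubles the number of amplitudes needed to describe the state.
BYTES_PER_AMPLITUDE = 16  # one double-precision complex number (2 x 8 bytes)

for n in (10, 30, 50, 300):
    dim = 2 ** n
    mem_bytes = dim * BYTES_PER_AMPLITUDE
    print(f"{n:>3} qubits -> {dim:.3e} amplitudes "
          f"(~{mem_bytes:.3e} bytes to store classically)")
```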

Superconducting chip generates tunable terahertz waves for compact imaging

A tiny crystal chip that uses terahertz radiation to see clearly through a wide range of materials could find applications in health care, biological research, and security screening. Researchers from Scotland and Japan have developed a lightweight superconducting chip, which they say could unlock the full potential of terahertz imaging technologies and lead to more powerful and portable devices.

The team’s paper, titled “Terahertz Imaging System with On-Chip Superconducting Josephson Plasma Emitters for Nondestructive Testing,” is published in IEEE Transactions on Applied Superconductivity.

Terahertz radiation lies between the microwave and infrared frequencies of the electromagnetic spectrum. It passes easily and harmlessly through a wide range of materials, and can be used to identify the characteristic “fingerprint” of molecules and biological materials as it does so, allowing them to be detected and analyzed.
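For a sense of scale, standard conversions place terahertz waves at sub-millimeter wavelengths and photon energies of a few millielectronvolts, far below ionizing levels. The snippet below runs those conversions for a few illustrative frequencies.

```python
# Standard conversions: wavelength = c / f, photon energy = h * f.
c  = 2.998e8        # speed of light, m/s
h  = 6.626e-34      # Planck constant, J*s
eV = 1.602e-19      # joules per electronvolt

for f_THz in (0.1, 1.0, 10.0):
    f = f_THz * 1e12
    wavelength_mm = c / f * 1e3
    energy_meV = h * f / eV * 1e3
    print(f"{f_THz:4.1f} THz -> wavelength {wavelength_mm:6.2f} mm, "
          f"photon energy {energy_meV:5.2f} meV")
```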

LeWorldModel: Stable End-to-End Joint-Embedding

Joint Embedding Predictive Architectures (JEPAs) offer a compelling framework for learning world models in compact latent spaces, yet existing methods remain fragile, relying on complex multi-term losses, exponential moving averages, pre-trained encoders, or auxiliary supervision to avoid representation collapse. In this work, we introduce LeWorldModel (LeWM), the first JEPA that trains stably end-to-end from raw pixels using only two loss terms: a next-embedding prediction loss and a regularizer enforcing Gaussian-distributed latent embeddings. This reduces tunable loss hyperparameters from six to one compared to the only existing end-to-end alternative. With ~15M parameters trainable on a single GPU in a few hours, LeWM plans up to 48× faster than foundation-model-based world models while remaining competitive across diverse 2D and 3D control tasks. Beyond control, we show that LeWM’s latent space encodes meaningful physical structure through probing of physical quantities. Surprise evaluation confirms that the model reliably detects physically implausible events.

TL;DR: LeWM is a JEPA-based world model that avoids representation collapse using a simple Gaussian regularizer (SIGReg), trains end-to-end from pixels with only two loss terms, and achieves competitive control performance at a fraction of the compute cost.
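The abstract does not spell out the SIGReg regularizer, so the sketch below is only a schematic of a two-term JEPA objective: a next-embedding prediction loss plus a stand-in moment-matching penalty that nudges latents toward zero mean and unit variance. The function name, the linear predictor, and the regularizer form are all illustrative assumptions, not the paper's implementation.

```python
import torch
import torch.nn.functional as F

def jepa_loss(z_t, z_next, predictor, reg_weight=1.0):
    """z_t, z_next: (batch, dim) latents of frames t and t+1 from the encoder."""
    # 1) Next-embedding prediction loss.
    pred_loss = F.mse_loss(predictor(z_t), z_next)

    # 2) Gaussian-shaping regularizer (assumed moment-matching form, not SIGReg):
    #    penalize deviation of batch statistics from a standard normal.
    mean_penalty = z_t.mean(dim=0).pow(2).mean()
    var_penalty = (z_t.var(dim=0, unbiased=False) - 1.0).pow(2).mean()
    reg_loss = mean_penalty + var_penalty

    # reg_weight would be the single tunable loss hyperparameter.
    return pred_loss + reg_weight * reg_loss

# Toy usage with a linear predictor over 64-dimensional latents.
predictor = torch.nn.Linear(64, 64)
z_t, z_next = torch.randn(32, 64), torch.randn(32, 64)
loss = jepa_loss(z_t, z_next, predictor)
loss.backward()
print(f"toy loss: {loss.item():.3f}")
```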
