The Data Center Boom Reshaping Williamson County, Texas

Williamson County is at the center of one of the most significant data center buildouts in the United States. What started as a handful of projects near Samsung’s Taylor semiconductor fabrication plant has become a full-scale infrastructure rush.

According to a March 2026 Propmodo analysis using Cushman & Wakefield data, the Austin–San Antonio data center corridor now has 7,823 megawatts of planned capacity compared to just 1,154 megawatts currently operating. More than 70 projects are being tracked between Temple and San Antonio, with Williamson County capturing a disproportionate share due to its power infrastructure, fiber connectivity, and available land. Of the 615 megawatts under construction in the corridor, 96 percent is already pre-leased, a remarkable indicator of demand.

A Texas A&M Real Estate Research Center analysis found that between 2023 and 2024, Central Texas saw a sharp increase in data center construction, totaling 463.5 megawatts of potential demand under development. That report specifically cited marquee projects in Williamson County as having reshaped regional land markets. Texas has 408 data centers listed statewide, the second-most in the nation, with the Austin market at 46 and climbing fast.

Optogenetics, Biohybrid Implants And The Future Of Brain-Computer Interfaces — Dr. Alan Mardinly Ph.D. — CSO & Co-Founder, Science


What if we could restore vision, communicate directly with the brain, and even extend human life—not with machines alone, but with living, engineered biology?

Dr. Alan Mardinly is the Chief Scientific Officer and Co-Founder of Science Corp. (https://science.xyz/), a neurotechnology company developing next-generation brain interfaces and biohybrid neural implants aimed at restoring human function.

Dr. Mardinly leads the company’s biohybrid program, focused on combining genetically engineered cells with advanced optical hardware to create optogenetic therapies for vision restoration and new types of brain-machine interfaces.

Dr. Mardinly has spent more than 15 years working at the intersection of neuroscience, genetics, and neural engineering.

Quantum computing without interruptions

Mid-circuit measurements are one of the biggest practical hurdles in quantum error correction on encoded qubits. Researchers in Innsbruck and Aachen have now proposed and experimentally demonstrated that a universal fault-tolerant quantum algorithm can be executed without such measurements. Using a trapped-ion quantum processor, the team successfully ran Grover’s quantum search algorithm on three logical qubits.

A key bottleneck in today’s leading approaches to quantum error correction is the need to repeatedly pause and measure the quantum processor mid-computation, a process that is slow, technically demanding, and itself a significant source of errors.

Now, a joint team from the University of Innsbruck, RWTH Aachen University, Forschungszentrum Jülich and spin-off Alpine Quantum Technologies (AQT) has demonstrated fault-tolerant quantum computation without any such interruptions.
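
The announcement does not include code, but the logical-level algorithm is textbook Grover search. As a rough sketch of what running Grover's search on three qubits means algorithmically, here is a minimal NumPy state-vector simulation; the marked index, iteration count, and variable names are illustrative assumptions, and nothing here models the fault-tolerant encoding, the measurement-free protocol, or the trapped-ion hardware.

```python
# Minimal state-vector sketch of Grover's search on three qubits (an 8-item space).
# It illustrates the logical algorithm only: no error correction, no encoding, and
# the single measurement happens once, at the very end of the circuit.
import numpy as np

n = 3                              # three qubits
N = 2 ** n                         # search space of 8 items
marked = 5                         # arbitrary choice of the item the oracle flags

# Uniform superposition over all basis states (Hadamard on every qubit).
state = np.full(N, 1 / np.sqrt(N))

# Oracle: flip the sign of the marked item's amplitude.
oracle = np.eye(N)
oracle[marked, marked] = -1.0

# Diffusion operator: reflect all amplitudes about their mean.
diffusion = 2.0 / N * np.ones((N, N)) - np.eye(N)

# Roughly (pi/4) * sqrt(N) Grover iterations are optimal; for N = 8 that is 2.
for _ in range(2):
    state = diffusion @ (oracle @ state)

# Only now is the register read out, once, at the end of the computation.
probabilities = np.abs(state) ** 2
print(f"P(measuring marked item {marked}) = {probabilities[marked]:.3f}")  # ~0.945
```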

Mechanical inputs boost diamond quantum sensor states as Q factor tops one million

Most people think of diamonds as high-end adornments. Not Ania Bleszynski Jayich. The UC Santa Barbara physicist sees the diamonds she grows in the UC Quantum Foundry as a potentially powerful foundation for quantum sensors. Sensors are currently much farther along in their development than other potential quantum applications. Diamond sensors are particularly promising because they require relatively few quantum bits (qubits) to operate, whereas a quantum computer, for instance, needs more than 100,000, and perhaps as many as a million, qubits to handle error correction, one of the main hurdles for quantum computing.

A paper about the latest advance from the Bleszynski Jayich lab, “Spin-embedded diamond optomechanical resonator with a mechanical quality factor exceeding one million,” has been published in the journal Optica.

New GPUBreach attack enables system takeover via GPU rowhammer

A new attack, dubbed GPUBreach, can induce Rowhammer bit-flips on GPU GDDR6 memories to escalate privileges and lead to a full system compromise.

GPUBreach was developed by a team of researchers at the University of Toronto, and full details will be presented at the upcoming IEEE Symposium on Security & Privacy on April 13 in Oakland.

The researchers demonstrated that Rowhammer-induced bit flips in GDDR6 can corrupt GPU page-table entries (PTEs) and grant an unprivileged CUDA kernel arbitrary read/write access to GPU memory.
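
To see why a single flipped bit in a PTE can be so consequential, here is a toy sketch using a made-up 64-bit PTE layout. The frame-number field, flag bits, and page size are assumptions for illustration; they do not reflect any real GPU's page-table format or the researchers' actual exploit, and the snippet performs no attack, only bit arithmetic.

```python
# Toy illustration of why one Rowhammer-induced bit flip in a page-table entry (PTE)
# can matter so much. The 64-bit layout below is purely hypothetical: a physical
# frame-number field plus a couple of permission flags. It does NOT reflect any real
# GPU's page-table format.
FRAME_SHIFT = 12           # assume 4 KiB pages: bits 12 and up hold the frame number
PRESENT_BIT = 1 << 0       # hypothetical "mapping is valid" flag
WRITABLE_BIT = 1 << 1      # hypothetical "writable" flag

def describe(pte: int) -> str:
    frame = pte >> FRAME_SHIFT
    flags = [name for bit, name in ((PRESENT_BIT, "present"), (WRITABLE_BIT, "writable"))
             if pte & bit]
    return f"frame 0x{frame:x} [{', '.join(flags) or 'no flags'}]"

# A benign mapping: a present, read-only page backed by physical frame 0x42000.
original = (0x42000 << FRAME_SHIFT) | PRESENT_BIT

# One flipped bit inside the frame-number field redirects the mapping to a
# different physical page entirely.
frame_bit_flip = original ^ (1 << (FRAME_SHIFT + 3))

# One flipped bit in the flag field silently makes a read-only page writable.
perm_bit_flip = original ^ WRITABLE_BIT

print("original       :", describe(original))
print("frame-bit flip :", describe(frame_bit_flip))
print("perm-bit flip  :", describe(perm_bit_flip))
```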

Linux devs start removing support for 37-year-old Intel 486 CPU — head honcho Linus Torvalds says ‘zero real reason’ to continue support

Perhaps it is time to send your 37-year-old Intel 486 system into retirement, at least as far as modern Linux goes, as kernel developers appear to have started dismantling support for the legendary CPU. Phoronix reports that the change seems to have been confirmed in patches destined for the Linux 7.1 kernel. Those still cherishing their 486 PCs and running a modern version of Linux on them should make sure they stay on one of the existing Linux LTS kernels to squeeze a few more years from the platform. Alternatively, they could upgrade to a Pentium or even one of the best CPUs available in 2026.

The patching out of 486 support isn’t really a surprise. Firstly, it is ancient, with the first examples released in 1989, and modern Linux distros continue to grow more resource-hungry. Secondly, Linux creator Linus Torvalds hinted not long ago that 486 support may get the axe. The Linux mogul said that there was “zero real reason” to continue support for the 486 CPU. In fact, he indicated that continuing support for it was detrimental to upstream Linux kernel development efforts.

Dozens of hidden star streams found in the outskirts of our Milky Way galaxy

To find them, Chen developed a computer algorithm called StarStream, which searches for streams using a physics-based model rather than relying on visual patterns alone, according to the study. The team then applied the method to Gaia data, which from 2014 to 2025 mapped the positions and motions of billions of stars in the Milky Way.
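
The study describes the approach only at a high level, but the general idea behind a physics-based stream search can be sketched: stars torn from the same cluster stay close together in quantities that are conserved along an orbit, such as energy and angular momentum, even when they no longer trace an obvious trail on the sky. The toy example below illustrates that idea with a made-up potential, a fake catalog, and arbitrary parameters; it is not the StarStream algorithm itself.

```python
# Toy sketch of a physics-based stream search (NOT the actual StarStream method).
# Stars stripped from the same cluster keep nearly the same orbital energy E and
# angular momentum Lz, so they clump in (E, Lz) space even when they no longer
# form a thin, obvious trail on the sky. Potential, units, and catalog are made up.
import numpy as np

rng = np.random.default_rng(0)
v0 = 220.0  # km/s, circular speed of a toy logarithmic halo potential

def energy_and_Lz(pos, vel):
    """Specific energy and z-angular momentum in a spherical logarithmic potential."""
    r = np.linalg.norm(pos, axis=1)
    phi = v0**2 * np.log(r)                                  # toy potential
    E = 0.5 * np.sum(vel**2, axis=1) + phi
    Lz = pos[:, 0] * vel[:, 1] - pos[:, 1] * vel[:, 0]
    return E, Lz

# Fake catalog: 2,000 smooth-halo stars plus 100 "stream" stars sharing nearly the
# same phase-space origin (positions in kpc, velocities in km/s, all invented).
halo_pos = rng.normal(0.0, 15.0, (2000, 3))
halo_vel = rng.normal(0.0, 150.0, (2000, 3))
stream_pos = rng.normal([10.0, 5.0, 2.0], 0.5, (100, 3))
stream_vel = rng.normal([50.0, 180.0, -20.0], 5.0, (100, 3))

pos = np.vstack([halo_pos, stream_pos])
vel = np.vstack([halo_vel, stream_vel])
E, Lz = energy_and_Lz(pos, vel)

# Crude search: histogram the stars in (E, Lz) and report the most crowded cell,
# which the tightly clumped stream stars should dominate.
H, E_edges, Lz_edges = np.histogram2d(E, Lz, bins=40)
i, j = np.unravel_index(np.argmax(H), H.shape)
print(f"most crowded (E, Lz) cell: E ~ {E_edges[i]:.0f} (km/s)^2, "
      f"Lz ~ {Lz_edges[j]:.0f} kpc km/s, {int(H[i, j])} stars")
```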

“It turns out that it’s a lot easier to find things when you have a theoretical expectation of what you’re looking for when you have a simple phenomenological picture,” Gnedin said in the statement.

The results also revealed that many streams do not match the classic expectation of thin, well-aligned trails. Instead, the study reports that some of the newfound streams are shorter, wider or even misaligned with their parent clusters’ orbits — suggesting earlier searches may have missed them by focusing only on the most obvious structures.
