The CL1 is the first computer in the world to combine human neurons with a silicon chip. It could be used in disease modeling and drug discovery, although its living neurons survive for only about six months.
Category: computing
Four physicists at the Hebrew University of Jerusalem, in Israel, have unraveled the mechanical process behind the growth of roses as they blossom into their unique shape. In their study published in the journal Science, Yafei Zhang, Omri Cohen, Michael Moshe and Eran Sharon adopted a multipronged approach to learn the secrets behind rose blossom growth. Qinghao Cui and Lishuai Jin, of the University of Hong Kong, have published a Perspective piece in the same journal issue outlining the work.
Roses have been prized for their beauty and sweet aromas for thousands of years, but until now the mechanics behind rose growth had not been explored. To better understand the process, the team took a three-pronged approach. First, they conducted a theoretical analysis. Then they built computer models to simulate the ways the flowers might grow and bloom. Finally, they created real-world bendable plastic disks to mimic petals and the possible ways they could grow given the constraints of real roses.
They found that the shape of the petals is strongly influenced by a geometric frustration known as the Mainardi-Codazzi-Peterson incompatibility, in which the compatibility conditions inherent to a surface made of a particular material are violated, producing stresses that generate rolling and sharp edges.
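For reference, the compatibility conditions in question can be written in standard surface-theory notation (this is the textbook form of the Codazzi-Mainardi equations, not necessarily the paper's exact formulation): a surface with metric $a_{ab}$ and second fundamental form $b_{ab}$ can exist stress-free only if

```latex
% Codazzi–Mainardi (Peterson) compatibility condition:
\nabla_a b_{bc} = \nabla_b b_{ac},
% together with the Gauss equation relating intrinsic curvature
% to the second fundamental form:
\qquad K = \frac{\det b_{ab}}{\det a_{ab}}.
```

When growth prescribes a metric and curvature that violate these conditions, no stress-free shape exists, and the resulting residual stress is what drives the rolling and sharp-edge formation described above.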
Photographer Stephen Voss has been working on a project about data centers and recently traveled to Abilene, Texas, to document the first data center built as part of the Stargate Project. When completed, it will be the largest data center in the world. Here’s a short drone video he took of the project:
“The place was mesmerizing and deeply unsettling,” Voss told me over email. “When finished, it’ll have the power demands of a mid-sized city and is on a piece of land that’s the size of Central Park.”
A new MIT-designed circuit achieves record-setting nonlinear coupling, allowing quantum operations to occur dramatically faster.
The heart of this advance is the “quarton coupler,” which boosts both light-matter and matter-matter interactions. This progress could lead to quicker quantum readouts, crucial for error correction and computation fidelity.
Unlocking Quantum Computing’s Speed Potential
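As context for the readout claim, the standard dispersive-readout Hamiltonian (a textbook form, not the paper's quarton Hamiltonian) shows why a stronger nonlinear coupling speeds things up:

```latex
% Dispersive coupling between a readout resonator (a) and a qubit (sigma_z):
H/\hbar = \omega_r\, a^\dagger a \;+\; \frac{\omega_q}{2}\,\sigma_z \;+\; \chi\, a^\dagger a\, \sigma_z
```

The qubit-state-dependent frequency shift $2\chi$ determines how quickly a measurement tone can distinguish the qubit's states, so boosting the nonlinear coupling $\chi$, as the quarton coupler is reported to do, directly accelerates readout.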
Most people’s experiences with polynomial equations don’t extend much further than high school algebra and the quadratic formula. Still, these numeric puzzles remain a foundational component of everything from calculating planetary orbits to computer programming. Although solving lower-order polynomials, where the variable is raised to at most the fourth power, is often a simple task, things get complicated at powers of five or greater. For centuries, mathematicians accepted this as an inherent limit of their work, but not Norman Wildberger. According to his new approach, detailed in The American Mathematical Monthly, there is a much more elegant way to handle high-order polynomials: get rid of pesky notions like irrational numbers.
Babylonians were solving degree-two polynomials as early as around 1800 BCE, but it took until the 16th century for mathematicians to extend the methods to degree-three and degree-four equations using roots, also known as radicals. Progress stalled there for another two centuries, with higher-degree examples stumping experts until 1832. That year, French mathematician Évariste Galois finally showed why this was such a problem: the underlying mathematical symmetry exploited by the methods for lower-order polynomials simply becomes too complicated at degree five or higher. For Galois, this meant there was no general formula for them.
Mathematicians have since developed approximate solutions, but they require integrating concepts like irrational numbers into the classical formula.
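One standard approximate technique (a generic numerical method, not Wildberger's series-based approach) is Newton's method, sketched here on a quintic that has no solution in radicals:

```python
# Newton's method on the quintic x^5 - x - 1 = 0, a classic example of a
# degree-five equation with no radical solution; illustrative of the
# approximate methods mentioned above.

def f(x):
    return x**5 - x - 1

def df(x):
    return 5 * x**4 - 1  # derivative of f

def newton(x, steps=50, tol=1e-12):
    """Iterate x <- x - f(x)/f'(x) until the update is negligible."""
    for _ in range(steps):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            break
    return x

root = newton(1.0)
print(round(root, 6))  # ~1.167304, the quintic's single real root
```

The iterate converges quadratically near the root, but the limit is an irrational number that can only ever be written down approximately, which is exactly the kind of object Wildberger's approach tries to avoid.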
What happens when trailblazing engineers and industry professionals team up? The answer may transform the future of computing efficiency for modern data centers.
Data centers house large computers that process massive amounts of data. Often, the processors can’t keep up with this workload, because predicting and preparing the instructions to carry out is taxing. This slows the flow of data, so when you type a question into a search engine, the answer generates more slowly or doesn’t provide the information you need.
To remedy this issue, researchers at Texas A&M University developed a new technique called Skia in collaboration with Intel, AheadComputing, and Princeton to help computer processors better predict future instructions and improve computing performance.
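The article doesn't detail how Skia itself works, but the general idea of a processor predicting upcoming instructions can be illustrated with a classic two-bit saturating-counter branch predictor (a generic textbook mechanism, not Skia's actual technique):

```python
# A classic 2-bit saturating-counter branch predictor: a generic sketch of
# how processors guess upcoming branch outcomes (not Skia's mechanism).

class TwoBitPredictor:
    # Counter states per branch: 0-1 predict not-taken, 2-3 predict taken.
    def __init__(self):
        self.counters = {}  # branch address -> 2-bit counter

    def predict(self, addr):
        return self.counters.get(addr, 1) >= 2  # True means "taken"

    def update(self, addr, taken):
        c = self.counters.get(addr, 1)
        c = min(c + 1, 3) if taken else max(c - 1, 0)  # saturate at 0 and 3
        self.counters[addr] = c

predictor = TwoBitPredictor()
# A loop's backward branch: taken 8 times, one exit, then taken 8 more.
outcomes = [True] * 8 + [False] + [True] * 8
hits = 0
for taken in outcomes:
    if predictor.predict(0x400) == taken:
        hits += 1
    predictor.update(0x400, taken)
print(f"{hits}/{len(outcomes)} correct")  # 15/17 correct
```

The two-bit hysteresis means a single loop exit costs only one misprediction rather than two, which is why even this simple scheme recovers 15 of 17 branches here; mispredictions are what stall the instruction flow described above.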
Alan Turing was a pioneer in the field of computer science. One of the things he is famous for is the Turing test. At its core, this is a test of whether or not a machine, a computer, can convince a human interrogator that it, too, is human.
Researchers have achieved a crucial milestone in quantum computing. They have created an operating system capable of enabling communication between quantum computers using different technologies.
This system, named QNodeOS, represents a significant advancement for quantum machine interoperability. Unlike classical operating systems such as Windows or iOS, it is designed to handle the unique complexity of qubits, regardless of their physical nature. This innovation paves the way for more flexible and powerful quantum networks.
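The interoperability idea, one software layer programming against many kinds of qubit hardware, follows a familiar abstraction pattern. The sketch below is purely illustrative: every class and method name is hypothetical, not QNodeOS's real API.

```python
# Purely illustrative hardware-abstraction pattern, in the spirit of what
# the article describes. All names here are hypothetical, not QNodeOS's API.

from abc import ABC, abstractmethod

class QubitBackend(ABC):
    """Common interface a node-level OS could program against."""
    @abstractmethod
    def entangle_with(self, remote_node: str) -> str: ...

class TrappedIonBackend(QubitBackend):
    def entangle_with(self, remote_node):
        return f"ion-photon link to {remote_node}"

class NVCenterBackend(QubitBackend):
    def entangle_with(self, remote_node):
        return f"NV-center link to {remote_node}"

def run_entanglement_request(backend: QubitBackend, remote: str) -> str:
    # The OS layer issues the same call regardless of the hardware below it.
    return backend.entangle_with(remote)

print(run_entanglement_request(TrappedIonBackend(), "delft-node"))
print(run_entanglement_request(NVCenterBackend(), "delft-node"))
```

The point of the pattern is that the scheduling and networking logic above the interface never changes when the qubit technology below it does, which is the flexibility the article attributes to QNodeOS.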
Scientists have achieved a major leap in quantum technology by deriving an exact mathematical expression crucial for refining noisy quantum entanglement into the pure states needed for advanced quantum computing and communication. Their work revisits and corrects flawed theories from two decades ago.
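For context, the long-standing benchmark that distillation results are measured against (a standard result, not the paper's new expression) is that for pure bipartite states the optimal distillation rate equals the entropy of entanglement:

```latex
% Distillable entanglement of a pure state equals the entropy of
% entanglement of either subsystem:
E_D\bigl(|\psi\rangle_{AB}\bigr) = S(\rho_A) = -\operatorname{Tr}\bigl(\rho_A \log_2 \rho_A\bigr),
\qquad \rho_A = \operatorname{Tr}_B |\psi\rangle\langle\psi|.
```

The hard open problems, including the one this work addresses, concern noisy mixed states, where no such simple closed form was known.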