
Scientists decipher the magma bodies under Yellowstone

Using supercomputer modeling, University of Oregon scientists have unveiled a new explanation for the geology underlying recent seismic imaging of magma bodies below Yellowstone National Park.

Yellowstone, a supervolcano famous for explosive eruptions, large calderas and extensive lava flows, has for years attracted the attention of scientists trying to understand the location and size of the magma bodies below it. The last caldera-forming eruption occurred 630,000 years ago; the last large volume of lava surfaced 70,000 years ago.

Crust below the park is heated and softened by continuous infusions of magma that rise from an anomaly called a mantle plume, similar to the source of the magma at Hawaii’s Kilauea volcano. Huge amounts of water, which also fuel the dramatic geysers and hot springs at Yellowstone, cool the crust and prevent it from becoming too hot.

Optical computers light up the horizon

Since their invention, computers have become faster and faster, as a result of our ability to increase the number of transistors on a processor chip.

Today, your smartphone is millions of times faster than the computers NASA used to put the first man on the moon in 1969. It even outperforms the most famous supercomputers from the 1990s. However, we are approaching the limits of this electronic technology, and now we see an interesting development: light and lasers are taking over electronics in computers.

Processors can now contain tiny lasers and light detectors, so they can send and receive data through small optical fibres at speeds far exceeding those of the electrical connections we use now. A few companies are even developing optical processors: chips that use laser light and optical switches, instead of electrical currents and electronic transistors, to do calculations.

New algorithm will allow for simulating neural connections of entire brain on future exascale supercomputers

Amazing.



An international team of scientists has developed an algorithm that represents a major step toward simulating neural connections in the entire human brain.

The new algorithm, described in an open-access paper published in Frontiers in Neuroinformatics, is intended to allow simulation of the human brain’s 100 billion interconnected neurons on supercomputers. The work involves researchers at the Jülich Research Centre, the Norwegian University of Life Sciences, Aachen University, RIKEN, and the KTH Royal Institute of Technology.

An open-source neural simulation tool. The algorithm was developed using NEST (“neural simulation tool”) — open-source simulation software in widespread use by the neuroscientific community and a core simulator of the European Human Brain Project. With NEST, the behavior of each neuron in the network is represented by a small number of mathematical equations, the researchers explain in an announcement.
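To give a sense of what such a model looks like in practice, here is a minimal PyNEST sketch. The neuron model, network size, and connection parameters are illustrative assumptions for a toy network, not the configuration used in the study, and the recorder name follows NEST 3.x conventions.

```python
# Minimal PyNEST sketch: a small network of leaky integrate-and-fire neurons.
# Sizes and parameters are illustrative assumptions, not the study's setup.
import nest

nest.ResetKernel()

# Each neuron's behavior is governed by a few differential equations; here we
# use the standard leaky integrate-and-fire model with alpha-shaped synapses.
neurons = nest.Create("iaf_psc_alpha", 1000)

# Background drive and a spike recorder ("spike_detector" in older NEST versions).
noise = nest.Create("poisson_generator", params={"rate": 8000.0})
recorder = nest.Create("spike_recorder")

# Sparse random connectivity: each neuron receives 100 incoming synapses.
nest.Connect(neurons, neurons,
             conn_spec={"rule": "fixed_indegree", "indegree": 100},
             syn_spec={"weight": 20.0, "delay": 1.5})
nest.Connect(noise, neurons, syn_spec={"weight": 10.0})
nest.Connect(neurons, recorder)

# Simulate one second of biological time and report activity.
nest.Simulate(1000.0)
print("Recorded spikes:", nest.GetStatus(recorder, "n_events")[0])
```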

Researchers find algorithm for large-scale brain simulations

An international group of researchers has made a decisive step towards creating the technology to achieve simulations of brain-scale networks on future supercomputers of the exascale class. The breakthrough, published in Frontiers in Neuroinformatics, allows larger parts of the human brain to be represented, using the same amount of computer memory. Simultaneously, the new algorithm significantly speeds up brain simulations on existing supercomputers.

The human brain is an organ of incredible complexity, composed of 100 billion interconnected nerve cells. However, even with the help of the most powerful supercomputers available, it is currently impossible to simulate the exchange of neuronal signals in networks of this size.

“Since 2014, our software has been able to simulate about one percent of the neurons in the human brain with all their connections,” says Markus Diesmann, Director at the Jülich Institute of Neuroscience and Medicine (INM-6). In order to achieve this impressive feat, the software requires the entire main memory of petascale supercomputers, such as the K computer in Kobe and JUQUEEN in Jülich.
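A rough back-of-the-envelope estimate makes clear why memory, rather than raw compute, is the bottleneck. The synapses-per-neuron count and per-synapse byte cost below are illustrative assumptions, not measurements of NEST’s actual data structures.

```python
# Back-of-the-envelope memory estimate for storing brain-scale connectivity.
# Synapse count per neuron and bytes per synapse are illustrative assumptions.

NEURONS = 100e9                # ~100 billion neurons, as cited in the article
SYNAPSES_PER_NEURON = 10_000   # commonly quoted order of magnitude
BYTES_PER_SYNAPSE = 24         # assumed: target id, weight, delay, bookkeeping

total_synapses = NEURONS * SYNAPSES_PER_NEURON
total_bytes = total_synapses * BYTES_PER_SYNAPSE

print(f"Synapses: {total_synapses:.1e}")
print(f"Memory for synapses alone: {total_bytes / 1e15:.0f} PB")

# Even the ~1% of the brain that Diesmann cites needs hundreds of terabytes,
# which is why it fills the main memory of a petascale machine.
one_percent_bytes = total_bytes * 0.01
print(f"1% of the brain: {one_percent_bytes / 1e12:.0f} TB")
```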

The New Religions Obsessed with A.I.

How far should we integrate human physiology with technology? What do we do with self-aware androids—like Blade Runner’s replicants—and self-aware supercomputers? Or the merging of our brains with them? If Ray Kurzweil’s famous singularity—a future in which the exponential growth of technology turns into a runaway train—becomes a reality, does religion have something to offer in response?


Yes, not only is A.I. potentially taking all of our jobs, but it’s also changing religion.

Brandon Withrow