
In their pursuit of understanding cosmic evolution, scientists rely on a two-pronged approach. Using advanced instruments, astronomical surveys attempt to look farther and farther into space (and back in time) to study the earliest periods of the Universe. At the same time, scientists create simulations that attempt to model how the Universe has evolved based on our understanding of physics. When the two match, astrophysicists and cosmologists know they are on the right track!

In recent years, increasingly detailed simulations have been run on ever more sophisticated supercomputers, yielding ever more accurate results. Recently, an international team of researchers led by the University of Helsinki conducted the most accurate simulations to date. Known as SIBELIUS-DARK, these simulations accurately predicted the evolution of our corner of the cosmos from the Big Bang to the present day.

In addition to the University of Helsinki, the team comprised researchers from the Institute for Computational Cosmology (ICC) and the Centre for Extragalactic Astronomy at Durham University, the Lorentz Institute for Theoretical Physics at Leiden University, the Institut d’Astrophysique de Paris, and The Oskar Klein Centre at Stockholm University. The team’s results are published in the Monthly Notices of the Royal Astronomical Society.

Circa 2020: simulation of the human brain.

Computer simulation of the human brain at an individual neuron resolution is an ultimate goal of computational neuroscience. The Japanese flagship supercomputer, K, provides unprecedented computational capability toward this goal. The cerebellum contains 80% of the neurons in the whole brain. Therefore, computer simulation of the human-scale cerebellum will be a challenge for modern supercomputers. In this study, we built a human-scale spiking network model of the cerebellum, composed of 68 billion spiking neurons, on the K computer. As a benchmark, we performed a computer simulation of a cerebellum-dependent eye movement task known as the optokinetic response. We succeeded in reproducing plausible neuronal activity patterns that are observed experimentally in animals. The model was built on dedicated neural network simulation software called MONET (Millefeuille-like Organization NEural neTwork), which calculates layered sheet types of neural networks with parallelization by tile partitioning. To examine the scalability of the MONET simulator, we repeatedly performed simulations while changing the number of compute nodes from 1,024 to 82,944 and measured the computational time. We observed a good weak-scaling property for our cerebellar network model. Using all 82,944 nodes, we succeeded in simulating a human-scale cerebellum for the first time, although the simulation ran 578 times slower than real time. These results suggest that the K computer is already capable of simulating a human-scale cerebellar model with the aid of the MONET simulator.
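The weak-scaling measurement described in the abstract (the problem size per node held fixed while the node count grows, so the ideal runtime stays flat) can be summarized as a simple efficiency ratio. The sketch below uses illustrative placeholder runtimes, not the paper's measured values:

```python
# Weak-scaling efficiency: with a fixed workload per node, the ideal runtime
# is constant as nodes are added. Efficiency = t(smallest run) / t(n nodes).
def weak_scaling_efficiency(base_time, times):
    """base_time: runtime at the smallest node count (seconds).
    times: dict mapping node count -> measured runtime (seconds)."""
    return {n: base_time / t for n, t in times.items()}

# Illustrative runtimes only -- placeholders, not the MONET paper's numbers.
measured = {1024: 100.0, 8192: 104.0, 82944: 115.0}
eff = weak_scaling_efficiency(measured[1024], measured)
for n, e in sorted(eff.items()):
    print(f"{n:6d} nodes: efficiency {e:.2f}")
```

An efficiency that stays near 1.0 as the node count grows from 1,024 to 82,944 is what "a good weak-scaling property" means in practice.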

Computer simulation of the whole human brain is an ambitious challenge in the field of computational neuroscience and high-performance computing (Izhikevich, 2005; Izhikevich and Edelman, 2008; Amunts et al., 2016). The human brain contains approximately 100 billion neurons. While the cerebral cortex occupies 82% of the brain mass, it contains only 19% (16 billion) of all neurons. The cerebellum, which occupies only 10% of the brain mass, contains 80% (69 billion) of all neurons (Herculano-Houzel, 2009). Thus, we could say that 80% of human-scale whole brain simulation will be accomplished when a human-scale cerebellum is built and simulated on a computer. The human cerebellum plays crucial roles not only in motor control and learning (Ito, 1984, 2000) but also in cognitive tasks (Ito, 2012; Buckner, 2013). In particular, the human cerebellum seems to be involved in human-specific tasks, such as bipedal locomotion, natural language processing, and use of tools (Lieberman, 2014).
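The "spiking neurons" in such models are typically simple point models. A minimal leaky integrate-and-fire (LIF) neuron, a common textbook choice and not necessarily the exact model used in MONET, can be simulated in a few lines:

```python
# Minimal leaky integrate-and-fire (LIF) neuron: the membrane potential v
# decays toward rest while integrating input current; a spike is emitted
# when v crosses threshold, after which v is reset.
def simulate_lif(current, dt=0.1, tau=10.0, v_rest=0.0, v_thresh=1.0, v_reset=0.0):
    v = v_rest
    spikes, trace = [], []
    for step, i_in in enumerate(current):
        v += (-(v - v_rest) + i_in) * dt / tau   # leaky integration
        if v >= v_thresh:
            spikes.append(step * dt)             # record spike time (ms)
            v = v_reset
        trace.append(v)
    return trace, spikes

# Constant suprathreshold input produces regular, repetitive spiking.
trace, spikes = simulate_lif([1.5] * 1000)       # 100 ms of input at dt = 0.1 ms
print(f"{len(spikes)} spikes in 100 ms")
```

Scaling this single-neuron update to 68 billion coupled neurons is precisely where tile partitioning across tens of thousands of nodes becomes necessary.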

Physicists searching—unsuccessfully—for today’s most favored candidate for dark matter, the axion, have been looking in the wrong place, according to a new supercomputer simulation of how axions were produced shortly after the Big Bang 13.8 billion years ago.

Using new calculational techniques and one of the world’s largest computers, Benjamin Safdi, assistant professor of physics at the University of California, Berkeley; Malte Buschmann, a postdoctoral research associate at Princeton University; and colleagues at MIT and Lawrence Berkeley National Laboratory simulated the era when axions would have been produced, approximately a billionth of a billionth of a billionth of a second after the universe came into existence and after the epoch of cosmic inflation.

The simulations at Berkeley Lab’s National Energy Research Scientific Computing Center (NERSC) found the axion’s mass to be more than twice as big as theorists and experimenters have thought: between 40 and 180 microelectron volts (micro-eV, or μeV), or about one 10-billionth the mass of the electron. There are indications, Safdi said, that the mass is close to 65 μeV. Since physicists began looking for the axion 40 years ago, estimates of the mass have ranged widely, from a few μeV to 500 μeV.
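The "one 10-billionth the mass of the electron" figure can be checked directly: the electron's rest energy is about 511 keV, so 65 μeV works out to roughly 1.3 × 10⁻¹⁰ of it:

```python
# Sanity check: axion mass estimate relative to the electron mass,
# with both masses expressed in energy units (eV).
electron_mass_ev = 511_000.0   # ~511 keV, electron rest energy
axion_mass_ev = 65e-6          # 65 micro-eV, central estimate from the simulation

ratio = axion_mass_ev / electron_mass_ev
print(f"axion/electron mass ratio: {ratio:.2e}")  # ~1.3e-10, about one 10-billionth
```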

Building a better supercomputer is something many tech companies, research outfits, and government agencies have been trying to do over the decades. There’s one physical constraint they’ve been unable to avoid, though: conducting electricity for supercomputing is expensive.

Not in an economic sense—although, yes, in an economic sense, too—but in terms of energy. The more electricity you conduct, the more resistance you create (electricians and physics majors, forgive me), which means more wasted energy in the form of heat and vibration. And you can’t let things get too hot, so you have to expend more energy to cool down your circuits.
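The waste-heat problem follows from Joule's law: the power dissipated in a resistive conductor is P = I²R, so pushing more current through the same wiring wastes quadratically more energy as heat. A toy calculation (illustrative values, not real chip figures):

```python
# Joule heating in a resistive conductor: P = I^2 * R.
# Doubling the current quadruples the heat that must be removed.
def joule_heat_watts(current_amps, resistance_ohms):
    return current_amps ** 2 * resistance_ohms

r = 0.05  # ohms, illustrative trace resistance
for i in (1.0, 2.0, 4.0):
    print(f"I = {i} A -> {joule_heat_watts(i, r):.2f} W of heat")
```

This quadratic blow-up is why cooling budgets, and interest in superconducting (zero-resistance) logic, loom so large in supercomputer design.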

Instead of relying on a fixed catalogue of available materials or undergoing trial-and-error attempts to come up with new ones, engineers can turn to algorithms running in supercomputers to design unique materials, based on a “materials genome,” with properties tailored to specific needs. Among the new classes of emerging materials are “transient” electronics and bioelectronics that portend applications and industries comparable to the scale that followed the advent of silicon-based electronics.

In each of the three technological spheres, we find the Cloud increasingly woven into the fabric of innovation. The Cloud itself is, synergistically, evolving and expanding from the advances in new materials and machines, creating a virtuous circle of self-amplifying progress. It is a unique feature of our emerging century that constitutes a catalyst for innovation and productivity, the likes of which the world has never seen.

Quantum computers could cause unprecedented disruption in both good and bad ways, from cracking the encryption that secures our data to solving some of chemistry’s most intractable puzzles. New research has given us more clarity about when that might happen.

Modern encryption schemes rely on fiendishly difficult math problems that would take even the largest supercomputers centuries to crack. But the unique capabilities of a quantum computer mean that at sufficient size and power these problems become simple, rendering today’s encryption useless.
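To get a feel for "fiendishly difficult": factoring an n-bit modulus by naive trial division costs on the order of √N = 2^(n/2) operations, which quickly outruns any supercomputer; Shor's algorithm on a quantum computer brings factoring down to polynomial time. A rough classical estimate, using toy numbers and deliberately ignoring faster (but still super-polynomial) attacks like the number field sieve:

```python
# Rough classical cost of factoring an n-bit modulus by trial division:
# ~sqrt(N) = 2**(n/2) divisions. Real attacks use the number field sieve,
# which is faster but still super-polynomial; this only shows the blow-up.
def trial_division_years(bits, ops_per_second=1e18):  # ~exascale machine
    operations = 2 ** (bits / 2)
    return operations / ops_per_second / (3600 * 24 * 365)

for bits in (64, 128, 256):
    print(f"{bits}-bit modulus: ~{trial_division_years(bits):.3g} years")
```

Even at an exaflop-scale rate of guesses, a 256-bit modulus already demands on the order of ten trillion years, which is the asymmetry quantum factoring would erase.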

That’s a big problem for cybersecurity, and it also poses a major challenge for cryptocurrencies, which use cryptographic keys to secure transactions. If someone could crack the underlying encryption scheme used by Bitcoin, for instance, they would be able to falsify these keys and alter transactions to steal coins or carry out other fraudulent activity.

Meta has just revealed its AI supercomputer, which it claims surpasses any of its competitors in capability and performance. Meta AI Research is using data from sites such as Facebook and Instagram to train and improve its models, in the hopes of controlling and influencing its users and for other future secret projects. What other dystopian things will come from this, one can only imagine.


Though Meta didn’t give numbers on RSC’s current top speed, in terms of raw processing power it appears comparable to the Perlmutter supercomputer, ranked fifth fastest in the world. At the moment, RSC runs on 6,800 NVIDIA A100 graphics processing units (GPUs), a specialized chip once limited to gaming but now used more widely, especially in AI. Already, the machine is processing computer vision workflows 20 times faster and large language models (like GPT-3) 3 times faster. The more quickly a company can train models, the more models it can complete and refine in any given year.

In addition to pure speed, RSC will give Meta the ability to train algorithms on its massive hoard of user data. In a blog post, the company said it previously trained AI on public, open-source datasets, but RSC will use real-world, user-generated data from Meta’s production servers. This detail may make more than a few people blanch, given the numerous privacy and security controversies Meta has faced in recent years. In the post, the company took pains to note that the data will be carefully anonymized and encrypted end to end. And, it said, RSC won’t have any direct connection to the larger internet.

To accommodate Meta’s enormous training data sets and further increase training speed, the installation will grow to include 16,000 GPUs and an exabyte of storage—equivalent to 36,000 years of high-quality video—later this year. Once complete, Meta says RSC will serve training data at 16 terabytes per second and operate at a top speed of 5 exaflops.
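Those figures are easy to sanity-check: at the quoted 16 terabytes per second, streaming through a full exabyte of training data would take roughly 17 hours.

```python
# Time to stream one exabyte at Meta's quoted 16 TB/s serving rate
# (decimal units: 1 EB = 1e18 bytes, 1 TB = 1e12 bytes).
exabyte_bytes = 1e18
rate_bytes_per_s = 16e12

seconds = exabyte_bytes / rate_bytes_per_s
print(f"{seconds:.0f} s = {seconds / 3600:.1f} hours")  # 62500 s, ~17.4 hours
```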

Quantum researchers at the University of Bristol have dramatically reduced the time to simulate an optical quantum computer, with a speedup of around one billion over previous approaches.

Quantum computers promise exponential speedups for certain problems, with potential applications in areas from drug discovery to new materials for batteries. But quantum computing is still in its early stages, so these are long-term goals. Nevertheless, there are exciting intermediate milestones on the journey to building a useful device. One currently receiving a lot of attention is “quantum advantage”, where a quantum computer performs a task beyond the capabilities of even the world’s most powerful supercomputers.

Experimental work from the University of Science and Technology of China (USTC) was the first to claim quantum advantage using photons (particles of light) in a protocol called “Gaussian Boson Sampling” (GBS). Their paper claimed that the experiment, performed in 200 seconds, would take 600 million years to simulate on the world’s largest supercomputer.
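The claimed gap can be condensed into a single number: 600 million years versus 200 seconds is a speedup factor of roughly 10¹⁴.

```python
# Claimed quantum advantage factor in the USTC Gaussian Boson Sampling result:
# classical simulation time divided by quantum experiment time.
SECONDS_PER_YEAR = 365.25 * 24 * 3600
classical_s = 600e6 * SECONDS_PER_YEAR  # 600 million years, in seconds
quantum_s = 200.0                       # the 200-second experiment

factor = classical_s / quantum_s
print(f"claimed speedup factor ~ {factor:.1e}")  # ~9.5e13
```

It is this fourteen-orders-of-magnitude figure that the Bristol team's billion-fold simulation speedup chips away at.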