
In the future, many computers will most likely be based on electronic circuits made of superconductors. These materials, through which an electrical current can flow without energy losses, could be very promising for the development of high-performance supercomputers and quantum computers.

Researchers at the University of California, Santa Barbara, Raytheon BBN Technologies, the University of Cagliari, Microsoft Research, and the Tokyo Institute of Technology have recently developed a magneto-optic modulator—a device that controls the properties of a light beam by means of a magnetic field. This device, introduced in a paper published in Nature Electronics, could contribute to the implementation of large-scale electronics and computers based on superconductors.

“We are working on a new technology that can speed up high-performance supercomputers and quantum computers based on superconductor technology,” Paolo Pintus, the researcher who led the study, told TechXplore. “Superconductors work properly only at low temperatures, generally just above absolute zero (−273.15° Celsius). Because of this, circuits made of these materials must be kept inside a dedicated refrigerator.”

Insights into how minute, yet powerful, bubbles form and collapse on underwater surfaces could help make industrial structures such as ship propellers more hardwearing, research suggests.

Supercomputer calculations have revealed details of the growth of so-called nanobubbles, which are tens of thousands of times smaller than a pin head.

The findings could lend valuable insight into the damage caused to industrial structures, such as pump components, when these bubbles burst to release tiny but powerful jets of liquid.

The founders all believed that the traditional method of building a quantum computer of a useful size would take too long. At the company’s inception, the PsiQuantum team established its goal to build a million-qubit, fault-tolerant photonic quantum computer. They also believed the only way to create such a machine was to manufacture it in a semiconductor foundry.

Early alerts

PsiQuantum first popped up on my quantum radar about two years ago when it received $150 million in Series C funding which upped total investments in the company to $215 million.


In 2019, Google announced that its 53-qubit Sycamore processor had finished a task in 3.3 minutes that would have taken a conventional supercomputer at least 2.5 days to accomplish. According to reports, China’s 66-Qubit Zuchongzhi 2 Quantum Processor was able to complete the same task 1 million times faster in October of last year. Together with the Shanghai Institute of Technical Physics and the Shanghai Institute of Microsystem and Information Technology, a group of researchers from the Chinese Academy of Sciences Center for Excellence in Quantum Information and Quantum Physics were responsible for the development of that processor.

According to NDTV, the Chinese government under Xi Jinping has spent $10 billion on the country’s National Laboratory for Quantum Information Sciences. This demonstrates China’s significant commitment to the field of quantum computing. According to Live Science, the nation is also a world leader in the field of quantum networking, which involves the transmission of data that has been encoded through the use of quantum mechanics over great distances.

The most powerful exascale supercomputer is set to launch in 2021 and will deliver a total of 64 exaflops, more than six times as much as the Leonardo supercomputer, which is also set to launch this year.
This is accomplished with the help of a new type of processor technology from Tachyum called “Prodigy,” described as the first universal processor.

This new processor is intended to enable general artificial intelligence at the speed of the human brain in real time. It is claimed to be many times faster than the fastest Intel Xeon, Nvidia graphics card, or Apple silicon. The new supercomputer is expected to enable simulations of the brain, medicine, and more that were previously thought impossible.


Article originally published on LINKtoLEADERS under the Portuguese title “Sem saber ler nem escrever!”

In the 80s, “with no knowledge, only intuition,” I discovered the world of computing. I believed computers could do everything, as if it were an electronic God. But when I asked the TIMEX Sinclair 1000 to draw the planet Saturn — I am fascinated by this planet, maybe because it has rings — all I glimpsed was a strange message on the black-and-white TV:

A team of Japanese and US physicists has cooled thousands of ytterbium atoms to within a billionth of a degree above absolute zero to understand how matter behaves at these extreme temperatures. The approach treats the atoms as fermions: particles, such as electrons and protons, that cannot condense into the so-called fifth state of matter at those extreme temperatures, a Bose–Einstein condensate.

When fermions are cooled down this far, they exhibit quantum properties in a way that we cannot simulate even with the most powerful supercomputer. These extremely cold atoms are placed in a lattice, where they simulate a “Hubbard model,” which is used to study the magnetic and superconducting behavior of materials, in particular the collective motion of electrons through them.

The symmetry of these models is described by the special unitary group, or SU(N), where N depends on the number of possible spin states; for ytterbium, that number is 6. Calculating the behavior of just 12 particles in an SU(6) Hubbard model is beyond the reach of classical computers. However, as reported in Nature Physics, the team used laser cooling to reduce the temperature of 300,000 atoms to a value almost three billion times colder than the temperature of outer space.
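For readers who want the underlying object, the Fermi–Hubbard model the article refers to has a standard textbook form. For N spin flavors (N = 6 for ytterbium’s nuclear spin states), a common way to write the SU(N)-symmetric Hamiltonian is (this is the generic textbook form, not an equation taken from the paper itself):

```latex
H \;=\; -t \sum_{\langle i,j \rangle} \sum_{\sigma=1}^{N}
        \left( c^{\dagger}_{i\sigma} c_{j\sigma} + \mathrm{h.c.} \right)
      \;+\; \frac{U}{2} \sum_{i} n_{i}\left( n_{i} - 1 \right)
```

Here t is the amplitude for an atom to hop between neighboring lattice sites ⟨i, j⟩, U is the on-site interaction energy, c†(iσ) creates an atom with spin flavor σ on site i, and n(i) is the total number of atoms on site i. The SU(N) symmetry arises because the interaction term does not depend on the spin flavor.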

Creating images from text in seconds—and doing so with a conventional graphics card and without supercomputers? As fanciful as it may sound, this is made possible by the new Stable Diffusion AI model. The underlying algorithm was developed by the Machine Vision & Learning Group led by Prof. Björn Ommer (LMU Munich).

“Even for laypeople not blessed with artistic talent and without special computing know-how, the new model is an effective tool that enables computers to generate images on command. As such, the model removes a barrier to expressing their creativity,” says Ommer. But there are benefits for seasoned artists as well, who can use Stable Diffusion to quickly convert new ideas into a variety of graphic drafts. The researchers are convinced that such AI-based tools will be able to expand the possibilities of creative image generation with paintbrush and Photoshop as fundamentally as computer-based word processing revolutionized writing with pens and typewriters.

In their project, the LMU scientists had the support of the start-up Stability AI, on whose servers the AI model was trained. “This additional computing power and the extra training examples turned our AI model into one of the most powerful image synthesis algorithms,” says the computer scientist.

The human brain is an amazing computing machine. Weighing only three pounds or so, it can process information a thousand times faster than the fastest supercomputer, store a thousand times more information than a powerful laptop, and do it all using no more energy than a 20-watt lightbulb.

Researchers are trying to replicate this success using soft, flexible organic materials that can operate like biological neurons and someday might even be able to interconnect with them. Eventually, soft “neuromorphic” computer chips could be implanted directly into the brain, allowing people to control an artificial arm or a computer monitor simply by thinking about it.

Like real neurons — but unlike conventional computer chips — these new devices can send and receive both chemical and electrical signals. “Your brain works with chemicals, with neurotransmitters like dopamine and serotonin. Our materials are able to interact electrochemically with them,” says Alberto Salleo, a materials scientist at Stanford University who wrote about the potential for organic neuromorphic devices in the 2021 Annual Review of Materials Research.

A breakthrough low-memory technique by Rice University computer scientists could put one of the most resource-intensive forms of artificial intelligence—deep-learning recommendation models (DLRM)—within reach of small companies.

DLRM recommendation systems are a popular form of AI that learns to make suggestions users will find relevant. But with top-of-the-line training models requiring more than a hundred terabytes of memory and supercomputer-scale processing, they’ve only been available to a short list of technology giants with deep pockets.

Rice’s “random offset block embedding array,” or ROBE Array, could change that. It is an algorithmic approach for slashing the size of the DLRM memory structures called embedding tables, and it will be presented this week at the Conference on Machine Learning and Systems (MLSys 2022) in Santa Clara, California, where it earned Outstanding Paper honors.
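The core idea behind this style of compression is to replace many large per-feature embedding tables with a single, much smaller shared parameter array, and to read each embedding vector out of that array in small blocks whose start positions are chosen by a hash function. The sketch below illustrates that hashed block lookup in plain NumPy; the array sizes, hash coefficients, and block layout are illustrative assumptions for the sketch, not the authors’ exact scheme:

```python
import numpy as np

# One flat shared array stands in for all embedding tables.
# Real DLRM tables can exceed 100 TB; these sizes are toy values.
MEM_SIZE = 1024            # total parameters kept (vs. num_ids * dim for a full table)
P = 2_147_483_647          # large prime for the hash (assumed for this sketch)
A, B = 1_000_003, 12_345   # fixed hash coefficients (assumed for this sketch)

def robe_lookup(weights, feature_id, dim, block_size=4):
    """Assemble a `dim`-length embedding for `feature_id` from the shared
    `weights` array, one block at a time, each block starting at a hashed
    offset that wraps around the array."""
    vec = np.empty(dim, dtype=weights.dtype)
    for j in range(0, dim, block_size):
        block_idx = j // block_size
        key = feature_id * 31 + block_idx          # mix id and block index
        start = ((A * key + B) % P) % len(weights)  # hashed start offset
        n = min(block_size, dim - j)
        idx = (start + np.arange(n)) % len(weights)  # wrap around the array
        vec[j:j + n] = weights[idx]
    return vec

rng = np.random.default_rng(0)
weights = rng.standard_normal(MEM_SIZE).astype(np.float32)
v = robe_lookup(weights, feature_id=42, dim=16)
print(v.shape)  # (16,)
```

Because the lookup is deterministic, the same feature id always reconstructs the same vector, while the shared array holds far fewer parameters than a full table; collisions between features are absorbed during training, which is what makes the memory reduction possible.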