
Tesla is gearing up to build its next-generation Dojo supercomputer at its Gigafactory in Buffalo, New York, as part of a $500 million investment announced by the state’s governor on Friday.

The Dojo supercomputer is designed to process massive amounts of data from Tesla’s vehicles and train its artificial intelligence (AI) systems for autonomous driving and other applications. It is expected to be one of the most powerful computing clusters in the world, rivaling today’s leading clusters built on NVIDIA hardware.

Cryptocurrency is usually “mined” through the blockchain by asking a computer to perform a complicated mathematical problem in exchange for tokens of cryptocurrency. But in research appearing in the journal Chem, a team of chemists has repurposed this process, asking computers instead to generate the largest network ever created of chemical reactions that may have given rise to prebiotic molecules on early Earth.

This work indicates that at least some primitive forms of metabolism might have emerged without the involvement of enzymes, and it shows the potential of blockchain to solve problems outside the financial sector that would otherwise require expensive, hard-to-access supercomputers.
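The “complicated mathematical problem” behind mining is typically a proof-of-work search: vary a nonce until a hash of the data meets a difficulty target. The sketch below is a minimal illustration of that mechanism, not the Chem paper’s repurposed workload; the data string and difficulty are arbitrary choices.

```python
import hashlib

def mine(block_data: str, difficulty: int) -> int:
    """Find a nonce such that SHA-256(block_data + nonce) begins with
    `difficulty` zero hex digits -- the kind of brute-force puzzle a
    miner solves in exchange for cryptocurrency tokens."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

# With 4 hex-digit difficulty, roughly 16**4 = 65,536 hashes are tried
# on average before a valid nonce is found.
nonce = mine("example block", difficulty=4)
print(nonce)
```

Raising the difficulty by one hex digit multiplies the expected work by 16, which is how such networks keep the puzzle expensive; the Chem study redirected that kind of raw compute toward enumerating reaction networks instead.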

“At this point we can say we exhaustively looked for every possible combination of chemical reactivity that scientists believe to have been operative on primitive Earth,” says senior author Bartosz A. Grzybowski of the Korea Institute for Basic Science and the Polish Academy of Sciences.

Give people a barrier, and at some point they are bound to smash through. Chuck Yeager broke the sound barrier in 1947. Yuri Gagarin burst into orbit for the first manned spaceflight in 1961. The Human Genome Project finished cracking the genetic code in 2003. And we can add one more barrier to humanity’s trophy case: the exascale barrier.

The exascale barrier represents the challenge of achieving exascale-level computing, long considered a milestone for high-performance computing. To reach that level, a computer needs to perform a quintillion calculations per second. You can think of a quintillion as a million trillion, a billion billion, or a million million million. Whichever you choose, it’s an incomprehensibly large number of calculations.
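Those three phrasings really do name the same number, which is easy to check directly:

```python
quintillion = 10**18
million_trillion = 10**6 * 10**12
billion_billion = 10**9 * 10**9
million_million_million = 10**6 * 10**6 * 10**6

print(quintillion == million_trillion == billion_billion == million_million_million)

# For scale: at one calculation per second, a quintillion calculations
# would take roughly 3.2e10 years -- over 31 billion years, more than
# twice the age of the universe.
years = quintillion / (60 * 60 * 24 * 365)
print(f"{years:.1e}")
```

An exascale machine compresses that multi-billion-year workload into a single second.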

On May 27, 2022, Frontier, a supercomputer at the Department of Energy’s Oak Ridge National Laboratory, managed the feat. It performed 1.1 quintillion calculations per second to become the fastest computer in the world.

New advancements in technology frequently necessitate the development of novel materials – and thanks to supercomputers and advanced simulations, researchers can bypass the time-consuming and often inefficient process of trial-and-error.

The Materials Project, an open-access database founded at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) in 2011, computes the properties of both known and predicted materials. Researchers can focus on promising materials for future technologies – think lighter alloys that improve fuel economy in cars, more efficient solar cells to boost renewable energy, or faster transistors for the next generation of computers.
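The screening workflow the Materials Project enables can be sketched as a filter over computed properties. The snippet below uses made-up placeholder entries and a hypothetical helper, not the real Materials Project API or its data; it only illustrates the idea of narrowing a candidate pool by a target property range.

```python
# Toy candidate pool: formulas and computed properties are invented
# placeholders, not real database entries.
candidates = [
    {"formula": "AlloyA",     "density_g_cm3": 2.7, "band_gap_eV": 0.0},
    {"formula": "SolarX",     "density_g_cm3": 5.3, "band_gap_eV": 1.4},
    {"formula": "SolarY",     "density_g_cm3": 4.1, "band_gap_eV": 1.1},
    {"formula": "InsulatorZ", "density_g_cm3": 3.9, "band_gap_eV": 5.6},
]

def screen_solar_absorbers(materials, gap_min=1.0, gap_max=1.8):
    """Keep materials whose computed band gap falls in a range broadly
    considered promising for single-junction solar absorbers."""
    return [m["formula"] for m in materials
            if gap_min <= m["band_gap_eV"] <= gap_max]

print(screen_solar_absorbers(candidates))  # ['SolarX', 'SolarY']
```

In practice the same pattern runs over hundreds of thousands of computed entries, which is what lets researchers skip much of the trial-and-error stage in the lab.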

Artificial intelligence has progressed from sci-fi fantasy to mainstream reality. AI now powers online tools from search engines to voice assistants and it is used in everything from medical imaging analysis to autonomous vehicles. But the advance of AI will soon collide with another pressing issue: energy consumption.

Much like cryptocurrencies today, AI risks becoming a target for criticism and regulation based on its high electricity appetite. Partisans are forming into camps, with AI optimists extolling continued progress through more compute power, while pessimists are beginning to portray AI power usage as wasteful and even dangerous. Attacks echo those leveled at crypto mining in recent years. Undoubtedly, there will be further efforts to choke off AI innovation by cutting its energy supply.

The pessimists raise some valid points. Developing ever-more capable AI does require vast computing resources. For example, the compute used to train OpenAI’s GPT-3, the model originally behind ChatGPT, reportedly equaled 800 petaflops of processing power, on par with the 20 most powerful supercomputers in the world combined. Similarly, ChatGPT receives somewhere on the order of hundreds of millions of queries each day. Estimates suggest that the electricity required to answer all of them might be around 1 GWh daily, enough to cover the daily energy consumption of about 33,000 U.S. households. Demand is expected to rise further.
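The household comparison can be sanity-checked with back-of-the-envelope arithmetic, assuming a typical U.S. household uses roughly 30 kWh per day (about 10,800 kWh per year, a rough average figure):

```python
# 1 GWh = 1e9 Wh = 1e6 kWh
daily_usage_kwh = 1_000_000

# Rough assumption: ~30 kWh/day per average U.S. household
avg_household_kwh_per_day = 30

households = daily_usage_kwh / avg_household_kwh_per_day
print(round(households))  # 33333
```

That lands right around the 33,000-household figure cited above, so the estimate is internally consistent.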

In a significant breakthrough, Microsoft and the Pacific Northwest National Laboratory have utilised artificial intelligence and supercomputing to discover a new material that could dramatically reduce lithium use in batteries by up to 70%. This discovery, potentially revolutionising the battery industry, was achieved by narrowing down from 32 million inorganic materials to 18 candidates in just a week, a process that could have taken over 20 years traditionally.


Lawrence Berkeley National Lab researchers use computational methods to describe an approach for optimizing the LK99 material as a superconductor.

Some will say: hey, why is Nextbigfuture still covering LK99? Didn’t some angry scientists say that LK99 was not a superconductor? I have been covering science for over 20 years, and there are a lot of angry scientists who believe many things will not work. Scientists who go into experiments looking to debunk something will not be the ones who figure out how to make it work.

Lawrence Berkeley National Lab researchers used supercomputers to try to figure out how to make LK99 work. Their computational work is showing promise.

A new biohybrid computer combining a “brain organoid” and a traditional AI was able to perform a speech recognition task with 78% accuracy — demonstrating the potential for human biology to one day boost our computing capabilities.

The background: The human brain is the most energy-efficient “computer” on Earth — while a supercomputer needs 20 megawatts of power to process more than a quintillion calculations per second, your brain can do the equivalent with just 20 watts (a megawatt is 1 million watts).
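The gap between those two power budgets is worth making explicit; taking the article’s figures at face value:

```python
supercomputer_watts = 20e6  # 20 MW for an exascale-class machine
brain_watts = 20            # rough estimate for the human brain

ratio = supercomputer_watts / brain_watts
print(f"{ratio:.0e}")  # 1e+06
```

A factor of a million in power for comparable throughput is the efficiency headroom that motivates biohybrid approaches like the organoid system described here.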

This has given researchers the idea to try boosting computers by combining them with a three-dimensional clump of lab-grown human brain cells, known as a brain organoid.