BLOG

Archive for the ‘supercomputing’ category: Page 32

Dec 1, 2022

[ML News] GPT-4 Rumors | AI Mind Reading | Neuron Interaction Solved | AI Theorem Proving

Posted by in categories: media & arts, robotics/AI, supercomputing

Your weekly news from the AI & Machine Learning world.

OUTLINE:
0:00 — Introduction.
0:25 — AI reads brain signals to predict what you’re thinking.
3:00 — Closed-form solution for neuron interactions.
4:15 — GPT-4 rumors.
6:50 — Cerebras supercomputer.
7:45 — Meta releases metagenomics atlas.
9:15 — AI advances in theorem proving.
10:40 — Better diffusion models with expert denoisers.
12:00 — BLOOMZ & mT0
13:05 — ICLR reviewers going mad.
21:40 — Scaling Transformer inference.
22:10 — Infinite nature flythrough generation.
23:55 — Blazing fast denoising.
24:45 — Large-scale AI training with MultiRay.
25:30 — arXiv to include Hugging Face spaces.
26:10 — Multilingual Diffusion.
26:30 — Music source separation.
26:50 — Multilingual CLIP
27:20 — Drug response prediction.
27:50 — Helpful Things.

Continue reading “[ML News] GPT-4 Rumors | AI Mind Reading | Neuron Interaction Solved | AI Theorem Proving” »

Nov 30, 2022

NASA uses a climate simulation supercomputer to better understand black hole jets

Posted by in categories: climatology, cosmology, evolution, particle physics, supercomputing

NASA’s Discover supercomputer simulated the extreme conditions of the distant cosmos.

A team of scientists from NASA’s Goddard Space Flight Center used the U.S. space agency’s Center for Climate Simulation (NCCS) Discover supercomputer to run 100 simulations of jets emerging from supermassive black holes.

Continue reading “NASA uses a climate simulation supercomputer to better understand black hole jets” »

Nov 28, 2022

Researchers publish 31,618 molecules with potential for energy storage in batteries

Posted by in categories: chemistry, information science, robotics/AI, supercomputing

Scientists from the Dutch Institute for Fundamental Energy Research (DIFFER) have created a database of 31,618 molecules that could potentially be used in future redox-flow batteries. These batteries hold great promise for energy storage. Among other things, the researchers used artificial intelligence and supercomputers to identify the molecules’ properties. Today, they publish their findings in the journal Scientific Data.

In recent years, chemists have designed hundreds of molecules that could potentially be useful in flow batteries for energy storage. It would be wonderful, researchers from DIFFER in Eindhoven (the Netherlands) imagined, if the properties of these molecules were quickly and easily accessible in a database. The problem, however, is that for many molecules the properties are not known. Examples of molecular properties are redox potential and water solubility. Those are important since they are related to the power generation capability and energy density of redox flow batteries.

To determine the still-unknown properties of the molecules, the researchers performed four steps. First, they used a supercomputer and smart algorithms to create thousands of virtual variants of two types of molecules. These molecule families, the quinones and aza-aromatics, are good at reversibly accepting and donating electrons, which is important for batteries. The researchers fed the computer the backbone structures of 24 quinones and 28 aza-aromatics, plus five different chemically relevant side groups. From that, the computer created 31,618 different molecules.
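As a minimal sketch of the combinatorial idea (not DIFFER's actual pipeline), one can think of every substitution site on a backbone as carrying one of the five side groups; the backbone names and site counts below are hypothetical placeholders.

```python
# Minimal sketch of combinatorial enumeration of molecule variants.
# NOTE: backbone names, substitution-site counts, and side-group labels are
# hypothetical placeholders, not the actual DIFFER inputs.
from itertools import product

SIDE_GROUPS = ["-H", "-OH", "-NH2", "-COOH", "-SO3H"]  # five side groups, as in the article

# Hypothetical backbones mapped to their number of substitution sites.
BACKBONES = {
    "quinone_backbone_01": 4,
    "aza_aromatic_backbone_01": 3,
}

def enumerate_variants(backbone: str, n_sites: int):
    """Yield one tuple per assignment of a side group to each substitution site."""
    for assignment in product(SIDE_GROUPS, repeat=n_sites):
        yield backbone, assignment

total = sum(1 for b, n in BACKBONES.items() for _ in enumerate_variants(b, n))
print(f"{total} candidate molecules")  # 5**4 + 5**3 = 750 for this toy input
```

The real library is far larger because the actual backbones expose many more substitution patterns; the sketch only shows why the count grows multiplicatively with sites and side groups.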

Nov 28, 2022

Fuel Ignition and Bottle Bubbles Snag Video Prize

Posted by in categories: energy, supercomputing

An annual APS video prize went to supercomputer simulations, control of chaotic Faraday waves, and studies of a large bubble in a bottle.

Nov 23, 2022

An optical chip that can train machine learning hardware

Posted by in categories: robotics/AI, supercomputing

A multi-institution research team has developed an optical chip that can train machine learning hardware. Their research is published today in Optica.

Machine learning applications have skyrocketed to $165 billion annually, according to a recent report from McKinsey. But before a machine can perform intelligent tasks such as recognizing the details of an image, it must be trained. Training modern-day artificial intelligence (AI) systems like Tesla’s Autopilot costs several million dollars in electric power consumption and requires supercomputer-like infrastructure.

This surging AI “appetite” leaves an ever-widening gap between computer hardware and the demand for AI. Photonic integrated circuits, or simply optical chips, have emerged as a possible solution to deliver higher computing performance, as measured by the number of operations performed per second per watt used, or TOPS/W. However, while they’ve demonstrated improved core operations in machine intelligence used for data classification, photonic chips have yet to improve the actual front-end learning and machine training process.
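As a rough illustration of the TOPS/W figure of merit mentioned above (the numbers are hypothetical, not measurements of any particular chip):

```python
# TOPS/W: tera-operations per second sustained, divided by the power drawn.
# The example numbers are hypothetical and only illustrate the arithmetic.
def tops_per_watt(ops_per_second: float, watts: float) -> float:
    """Compute throughput efficiency in tera-operations per second per watt."""
    return (ops_per_second / 1e12) / watts

# A hypothetical accelerator sustaining 4e14 ops/s while drawing 200 W:
print(tops_per_watt(4e14, 200.0))  # -> 2.0 TOPS/W
```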

Nov 23, 2022

Artificial Intelligence & Robotics Tech News For October 2022

Posted by in categories: cyborgs, drones, Elon Musk, information science, quantum physics, robotics/AI, supercomputing, transhumanism, virtual reality

https://www.youtube.com/watch?v=QrXnYHubFPc

Deep Learning AI Specialization: https://imp.i384100.net/GET-STARTED
AI News Timestamps:
0:00 New AI Robot Dog Beats Human Soccer Skills.
2:34 Breakthrough Humanoid Robotics & AI Tech.
5:21 Google AI Makes HD Video From Text.
8:41 New OpenAI DALL-E Robotics.
11:31 Elon Musk Reveals Tesla Optimus AI Robot.
16:49 Machine Learning Driven Exoskeleton.
19:33 Google AI Makes Video Game Objects From Text.
22:12 Breakthrough Tesla AI Supercomputer.
25:32 Underwater Drone Humanoid Robot.
29:19 Breakthrough Google AI Edits Images With Text.
31:43 New Deep Learning Tech With Light waves.
34:50 Nvidia General Robot Manipulation AI
36:31 Quantum Computer Breakthrough.
38:00 In-Vitro Neural Network Plays Video Games.
39:56 Google DeepMind AI Discovers New Matrices Algorithms.
45:07 New Meta Text To Video AI
48:00 Bionic Tech Feels In Virtual Reality.
53:06 Quantum Physics AI
56:40 Soft Robotics Gripper Learns.
58:13 New Google NLP Powered Robotics.
59:48 Ionic Chips For AI Neural Networks.
1:02:43 Machine Learning Interprets Brain Waves & Reads Mind.

Nov 22, 2022

This AI Supercomputer Has 13.5 Million Cores—and Was Built in Just Three Days

Posted by in categories: robotics/AI, supercomputing

At the time, all this was theoretical. But last week, Cerebras announced it had linked 16 CS-2s together into a world-class AI supercomputer.

Meet Andromeda

The new machine, called Andromeda, has 13.5 million cores capable of speeds over an exaflop (one quintillion operations per second) at 16-bit half precision. Due to the unique chip at its core, Andromeda isn’t easily compared to supercomputers running on more traditional CPUs and GPUs, but Feldman told HPCwire that Andromeda is roughly equivalent to Argonne National Laboratory’s Polaris supercomputer, which ranks 17th fastest in the world, according to the latest Top500 list.
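A back-of-the-envelope check on those figures, using only the numbers quoted above:

```python
# Rough per-core throughput implied by the quoted figures: one exaflop of
# 16-bit operations spread across Andromeda's 13.5 million cores.
total_ops_per_s = 1e18   # one exaflop (FP16), per the article
core_count = 13.5e6      # 13.5 million cores

per_core = total_ops_per_s / core_count
print(f"{per_core:.2e} FP16 ops/s per core")  # ~7.41e+10, i.e. roughly 74 GOPS per core
```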

Nov 21, 2022

Microsoft and Nvidia partner to build AI supercomputer in the cloud

Posted by in categories: robotics/AI, supercomputing

A supercomputer, providing massive amounts of computing power to tackle complex challenges, is typically out of reach for the average enterprise data scientist. However, what if you could use cloud resources instead? That’s the rationale that Microsoft Azure and Nvidia are taking with this week’s announcement designed to coincide with the SC22 supercomputing conference.

Nvidia and Microsoft announced that they are building a “massive cloud AI computer.” The supercomputer in question, however, is not an individually named system like the Frontier system at Oak Ridge National Laboratory or the Perlmutter system, the world’s fastest artificial intelligence (AI) supercomputer. Rather, the new AI supercomputer is a set of capabilities and services within Azure, powered by Nvidia technologies, for high-performance computing (HPC) uses.

Nov 21, 2022

Les Ordinateurs Quantiques (Quantum Computers)

Posted by in categories: quantum physics, supercomputing

Could energy efficiency be quantum computers’ greatest strength yet?

Quantum computers have attracted considerable interest of late for their potential to crack, in a few hours, problems that would take the age of the universe (i.e., tens of billions of years) on the best supercomputers. Their real-life applications range from drug and materials design to solving complex optimization problems. They are, therefore, primarily intended for scientific and industrial research.

Continue reading “Les Ordinateurs Quantiques” »

Nov 19, 2022

Why This Breakthrough AI Now Runs A Nuclear Fusion Reactor | New AI Supercomputer

Posted by in categories: information science, nuclear energy, robotics/AI, supercomputing

Deep Learning AI Specialization: https://imp.i384100.net/GET-STARTED
Nuclear fusion researchers have created a machine learning algorithm to detect and track the plasma blobs that build up inside a tokamak, supporting the prediction of plasma disruptions, the diagnosis of plasma via spectroscopy and tomography, and the tracking of turbulence inside the fusion reactor. Cerebras has built a new AI supercomputer with over 13.5 million processor cores and over 1 exaflop of compute power. A new study presents an innovative neuro-computational model of the human brain which could lead to the creation of conscious AI or artificial general intelligence (AGI).
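The published blob-detection work relies on a trained neural network; purely as a generic, hypothetical illustration of what frame-by-frame blob detection means operationally, a simple threshold-and-label pass over a 2D diagnostic frame might look like the sketch below (threshold and frame size are placeholders).

```python
# Generic blob detection on a 2D frame: threshold, label connected regions,
# and report centroids. This is NOT the researchers' model, only an illustration.
import numpy as np
from scipy import ndimage

def detect_blobs(frame: np.ndarray, threshold: float):
    """Return the number of above-threshold regions and their centroids."""
    mask = frame > threshold
    labels, count = ndimage.label(mask)
    centroids = ndimage.center_of_mass(frame, labels, range(1, count + 1))
    return count, centroids

# Toy usage with random data standing in for a diagnostic camera frame.
rng = np.random.default_rng(seed=0)
frame = rng.random((64, 64))
count, centroids = detect_blobs(frame, threshold=0.98)
print(count, centroids[:3])
```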

AI News Timestamps:
0:00 Breakthrough AI Runs A Nuclear Fusion Reactor.
3:07 New AI Supercomputer.
6:19 New Brain Model For Conscious AI

#ai #ml #nuclear

Page 32 of 94