BLOG

Archive for the ‘supercomputing’ category: Page 5

Aug 5, 2024

Will neuromorphic computers accelerate AGI development?

Posted by in categories: biological, robotics/AI, supercomputing

Neuromorphic computers are devices that try to achieve reasoning capability by emulating the human brain. They use a different kind of computer architecture, one that copies the physical characteristics and design principles of biological nervous systems. Neuromorphic computation can be emulated on classical computers, but doing so is very inefficient, so new, dedicated hardware is typically required.

The first neuromorphic computer at the scale of a full human brain is about to come online. Called DeepSouth, it is being built at Western Sydney University, with completion slated for April 2024. The machine should enable new research into how our brain actually functions, potentially leading to breakthroughs in how AI is created.
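To see why classical emulation is so costly, consider the kind of dynamics neuromorphic chips implement natively: spiking neurons whose membrane voltages evolve continuously and fire when they cross a threshold. The sketch below is a minimal, illustrative leaky integrate-and-fire neuron in Python; the parameter values are arbitrary assumptions, not DeepSouth's.

```python
# Illustrative leaky integrate-and-fire (LIF) neuron, stepped with Euler's method.
# All parameters are arbitrary example values, not taken from DeepSouth.
dt = 1e-4          # time step (s)
tau = 20e-3        # membrane time constant (s)
v_rest = -70e-3    # resting potential (V)
v_thresh = -50e-3  # spike threshold (V)
v_reset = -75e-3   # reset potential after a spike (V)
r_m = 1e7          # membrane resistance (ohm)
i_in = 2.5e-9      # constant input current (A)

v = v_rest
spike_times = []
n_steps = int(0.5 / dt)                      # simulate 0.5 s of activity

for step in range(n_steps):
    # Euler step of tau * dv/dt = -(v - v_rest) + r_m * i_in
    v += dt * (-(v - v_rest) + r_m * i_in) / tau
    if v >= v_thresh:                        # threshold crossing: emit a spike
        spike_times.append(step * dt)
        v = v_reset                          # reset the membrane potential

print(f"{len(spike_times)} spikes in 0.5 s of simulated time")
```

A conventional machine has to step through equations like this for every simulated neuron and synapse at every time step; scaled to a brain-like network of tens of billions of neurons and on the order of a hundred trillion synapses, that per-step loop is what makes software emulation so expensive and why dedicated hardware helps.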

Continue reading “Will neuromorphic computers accelerate AGI development?” »

Aug 3, 2024

NVIDIA Accelerating the Future of AI & Humanoid Robots

Posted by in categories: economics, robotics/AI, singularity, supercomputing, virtual reality

And this shows one of the many ways in which the Economic Singularity is rushing at us. The 🦾🤖 Bots are coming soon to a job near you.


NVIDIA unveiled a suite of services, models, and computing platforms designed to accelerate the development of humanoid robots globally. Key highlights include:

Continue reading “NVIDIA Accelerating the Future of AI & Humanoid Robots” »

Jul 30, 2024

AI brain images create realistic synthetic data to use in medical research

Posted by in categories: biotech/medical, information science, robotics/AI, supercomputing

An AI model developed by scientists at King’s College London, in close collaboration with University College London, has produced three-dimensional, synthetic images of the human brain that are realistic and accurate enough to use in medical research.

The model and images have helped scientists better understand what the human brain looks like, supporting research to predict, diagnose and treat conditions such as dementia, stroke, and multiple sclerosis.

The algorithm was created using the NVIDIA Cambridge-1, the UK’s most powerful supercomputer. One of the fastest supercomputers in the world, the Cambridge-1 allowed researchers to train the AI in weeks rather than months and produce images of far higher quality.

Jul 23, 2024

A hybrid supercomputer: Researchers integrate a quantum computer into a high-performance computing environment

Posted by in categories: chemistry, energy, quantum physics, supercomputing

Working together, the University of Innsbruck and the spin-off AQT have integrated a quantum computer into a high-performance computing (HPC) environment for the first time in Austria. This hybrid infrastructure of supercomputer and quantum computer can now be used to solve complex problems in various fields such as chemistry, materials science or optimization.

Demand for computing power is constantly increasing and the consumption of resources to support these calculations is growing. Processor clock speeds in conventional computers, typically a few GHz, appear to have reached their limit.

Performance improvements over the last 10 years have therefore come primarily from parallelizing tasks across multi-core systems, which HPC centers operate as fast, networked multi-node computing clusters. However, computing power increases only approximately linearly with the number of nodes.
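As a rough, illustrative way to see that ceiling (not taken from the article): Amdahl's law says the speedup of a fixed-size job is limited by whatever fraction of it cannot be parallelized, so adding nodes gives at best near-linear gains and eventually diminishing returns. The serial fraction below is an assumed example value.

```python
# Illustrative only: Amdahl's-law speedup for a fixed-size workload.
# The 5% serial fraction is an assumed example value, not a measured figure.
def amdahl_speedup(nodes: int, serial_fraction: float = 0.05) -> float:
    """Best-case speedup on `nodes` nodes when `serial_fraction` of the
    work cannot be parallelized."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / nodes)

for n in (1, 8, 64, 512, 4096):
    print(f"{n:>5} nodes -> speedup {amdahl_speedup(n):5.1f}x")
```

With a 5% serial fraction the speedup saturates near 20x no matter how many nodes are added; only in the perfectly parallel limit does performance grow linearly with node count, which is the scaling limit referred to above and one motivation for pairing HPC clusters with quantum processors.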

Jul 23, 2024

Diving into Organoid Intelligence

Posted by in categories: biotech/medical, health, robotics/AI, supercomputing

The field of organoid intelligence is recognized as groundbreaking. In this field, scientists utilize human brain cells to enhance computer functionality. They cultivate tissues in laboratories that mimic real organs, particularly the brain. These brain organoids can perform brain-like functions and are being developed by Dr. Thomas Hartung and his team at the Johns Hopkins Bloomberg School of Public Health.

For nearly two decades, scientists have used organoids to conduct experiments without harming humans or animals. Hartung, who has been cultivating brain organoids from human skin samples since 2012, aims to integrate these organoids into computing. This approach promises more energy-efficient computing than current supercomputers and could revolutionize drug testing, improve our understanding of the human brain, and push the boundaries of computing technology.

This research highlights the potential of biocomputing to surpass the limitations of traditional computing and AI. Despite AI’s advancements, it still falls short of replicating the human brain’s capabilities, such as energy efficiency, learning, and complex decision-making. The human brain’s capacity for information storage and energy efficiency remains unparalleled by modern computers. Hartung’s work with brain organoids, inspired by Nobel Prize-winning stem cell research, aims to replicate cognitive functions in the lab. This research could open new avenues for understanding the human brain by allowing ethical experimentation. The team envisions scaling up the size of brain organoids and developing communication tools for input and output, enabling more complex tasks.

Jul 13, 2024

Simulating the universe’s most extreme environments

Posted by in categories: particle physics, quantum physics, supercomputing

The Standard Model of Particle Physics encapsulates nearly everything we know about the tiny quantum-scale particles that make up our everyday world. It is a remarkable achievement, but it’s also incomplete — rife with unanswered questions. To fill the gaps in our knowledge, and discover new laws of physics beyond the Standard Model, we must study the exotic phenomena and states of matter that don’t exist in our everyday world. These include the high-energy collisions of particles and nuclei that take place in the fiery heart of stars, in cosmic ray events occurring all across Earth’s upper atmosphere, and in particle accelerators like the Large Hadron Collider (LHC) at CERN or the Relativistic Heavy Ion Collider at Brookhaven National Laboratory.

Computer simulations of fundamental physics processes play an essential role in this research, but many important questions require simulations that are much too complex for even the most powerful classical supercomputers. Now that utility-scale quantum computers have demonstrated the ability to simulate quantum systems at a scale beyond exact or “brute force” classical methods, researchers are exploring how these devices might help us run simulations and answer scientific questions that are inaccessible to classical computation. In two recent papers published in PRX Quantum (PRX) [1] and Physical Review D (PRD) [2], our research group did just that, developing scalable techniques for simulating the real-time dynamics of quantum-scale particles using the IBM® fleet of utility-scale, superconducting quantum computers.

The techniques we’ve developed could very well serve as the building blocks for future quantum computer simulations that are completely inaccessible to both exact and even approximate classical methods — simulations that would demonstrate what we call “quantum advantage” over all known classical techniques. Our results provide clear evidence that such simulations are potentially within reach of the quantum hardware we have today.
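The circuits and models from the PRX and PRD papers are not reproduced here, but a common building block of such real-time dynamics simulations is Trotterized time evolution. The following is a minimal sketch, assuming Qiskit is available; the transverse-field Ising Hamiltonian, couplings, and step counts are arbitrary illustrative choices rather than those used in the papers.

```python
# Minimal sketch: first-order Trotterized real-time evolution of a small
# transverse-field Ising chain, H = J * sum_i Z_i Z_{i+1} + h * sum_i X_i.
# All parameters are arbitrary illustration values, not from the cited papers.
from qiskit import QuantumCircuit
from qiskit.quantum_info import SparsePauliOp, Statevector

n_qubits, J, h = 4, 1.0, 0.7
t_total, n_steps = 2.0, 20
dt = t_total / n_steps

qc = QuantumCircuit(n_qubits)
for _ in range(n_steps):
    # exp(-i * J*dt * Z_i Z_{i+1}) on each neighboring pair -> RZZ(2*J*dt)
    for q in range(n_qubits - 1):
        qc.rzz(2 * J * dt, q, q + 1)
    # exp(-i * h*dt * X_i) on each site -> RX(2*h*dt)
    for q in range(n_qubits):
        qc.rx(2 * h * dt, q)

# Exact statevector evaluation is fine at this size; a hardware run would
# instead transpile the circuit and estimate observables from measurements.
state = Statevector.from_instruction(qc)
z0 = SparsePauliOp("I" * (n_qubits - 1) + "Z")  # Z on qubit 0 (little-endian)
print(f"<Z_0> after t = {t_total}: {state.expectation_value(z0).real:.4f}")
```

On real hardware the same circuit would be transpiled to the device's native gates and its observables estimated from sampled measurements, typically with error mitigation, rather than evaluated exactly as a statevector.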

Jul 12, 2024

China: Quantum tech cracks subatomic code, beats supercomputers

Posted by in categories: energy, quantum physics, supercomputing

A Chinese research team has achieved a significant milestone in quantum computing by successfully building a device that can simulate the movement of electrons within a solid-state material.

This research, published in the journal Nature, showcases the potential of quantum computers to surpass even the most powerful supercomputers.

Understanding electron behavior is crucial for scientific advancements, particularly in the fields of magnetism and high-temperature superconducting materials. These materials could revolutionize electricity transmission and transportation, leading to significant energy savings and technological progress.

Jul 11, 2024

$457M from National Science Foundation to help establish new computing center at UT focused on AI

Posted by in categories: robotics/AI, science, supercomputing

Hundreds of millions of dollars have been set aside to establish a new AI-focused supercomputing center at the University of Texas.

Jul 9, 2024

Defense Innovation Unit project makes supercomputers more accessible

Posted by in categories: innovation, supercomputing

Two commercial firms demonstrated that they could provide high-performance computing tools on the cloud.

Jul 9, 2024

NASA’s Roman Mission Gets Cosmic ‘Sneak Peek’ From Supercomputers

Posted by in categories: space, supercomputing

Researchers used supercomputers to create nearly 4 million simulated images depicting the cosmos.

Researchers are diving into a synthetic universe to help us better understand the real one. Using supercomputers at the U.S. DOE’s (Department of Energy’s) Argonne National Laboratory in Illinois, scientists have created nearly 4 million simulated images depicting the cosmos as NASA’s Nancy Grace Roman Space Telescope and the Vera C. Rubin Observatory in Chile, which is jointly funded by NSF (the National Science Foundation) and DOE, will see it.

Michael Troxel, an associate professor of physics at Duke University in Durham, North Carolina, led the simulation campaign as part of a broader project called OpenUniverse. The team is now releasing a 10-terabyte subset of this data, with the remaining 390 terabytes to follow this fall once they’ve been processed.

Page 5 of 96