BLOG

Archive for the ‘supercomputing’ category: Page 67

May 6, 2019

MIT Cryptographers Are No Match For A Determined Belgian

Posted in categories: robotics/AI, supercomputing

Twenty years ago, a cryptographic puzzle was included in the construction of a building on the MIT campus. The structure that houses what is now MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) includes a time capsule designed by the building’s architect, [Frank Gehry]. It contains artifacts related to the history of computing, and was meant to be opened whenever someone solved a cryptographic puzzle, or after 35 years had elapsed.

The puzzle was not expected to be solved early, but [Bernard Fabrot], a developer in Belgium, has managed it using not a supercomputer but a run-of-the-mill Intel i7 processor. The capsule will be opened later in May.

The famous cryptographer [Ronald Rivest] put together what we now know is a deceptively simple challenge. It is built around a chain of successive squaring operations, and since the computation is inherently sequential there is no possibility of using parallel computing techniques to take shortcuts. [Fabrot] used the GNU Multiple Precision Arithmetic Library in his code, and the solution took over three years of computing time. Meanwhile, another team attacking the puzzle with an FPGA expects a solution within months, though they have been pipped to the post by the Belgian.
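For a sense of why the puzzle resists parallelism: Rivest's design (the LCS35 time-lock puzzle) asks for w = 2^(2^t) mod n, computed as t modular squarings in a row, where each squaring needs the result of the previous one. The sketch below is a minimal illustration with toy parameters; the real puzzle reportedly used a 2048-bit modulus and tens of trillions of squarings, hence years of runtime on a fast core.

```python
# Minimal sketch of a Rivest-style time-lock puzzle (LCS35): compute
# w = 2^(2^t) mod n by performing t modular squarings in sequence.
# Each squaring depends on the previous result, so extra cores or a
# supercomputer offer no shortcut. Toy parameters only; the real
# puzzle reportedly used a 2048-bit modulus and tens of trillions
# of squarings.

def time_lock(t: int, n: int) -> int:
    w = 2
    for _ in range(t):       # inherently sequential loop
        w = (w * w) % n      # one modular squaring per step
    return w

# Sanity check against Python's built-in modular exponentiation:
assert time_lock(10, 1009) == pow(2, 2**10, 1009)
print(time_lock(10, 1009))
```

Fast big-integer squaring is exactly what GMP provides, which is why a well-tuned desktop CPU could grind through the computation in a few years.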

Continue reading “MIT Cryptographers Are No Match For A Determined Belgian” »

Apr 25, 2019

A breakthrough in the study of laser/plasma interactions

Posted in categories: biotech/medical, supercomputing

A new 3D particle-in-cell (PIC) simulation tool developed by researchers from Lawrence Berkeley National Laboratory and CEA Saclay is enabling cutting-edge simulations of laser/plasma coupling mechanisms that were previously out of reach of standard PIC codes used in plasma research. More detailed understanding of these mechanisms is critical to the development of ultra-compact particle accelerators and light sources that could solve long-standing challenges in medicine, industry, and fundamental science more efficiently and cost-effectively.

In laser-plasma experiments such as those at the Berkeley Lab Laser Accelerator (BELLA) Center and at CEA Saclay—an international research facility in France that is part of the French Atomic Energy Commission—very large electric fields within plasmas accelerate particle beams to high energies over much shorter distances than existing accelerator technologies require. The long-term goal of these laser-plasma accelerators (LPAs) is to one day build colliders for high-energy research, but many spin-offs are already being developed. For instance, LPAs can quickly deposit large amounts of energy into solid materials, creating dense plasmas and subjecting this matter to extreme temperatures and pressures. They also hold the potential for driving free-electron lasers that generate light pulses lasting just attoseconds. Such extremely short pulses could enable researchers to observe the interactions of molecules, atoms, and even subatomic particles on extremely short timescales.

Supercomputer simulations have become increasingly critical to this research, and Berkeley Lab’s National Energy Research Scientific Computing Center (NERSC) has become an important resource in this effort. By giving researchers access to physical observables such as particle orbits and radiated fields that are hard to get in experiments at extremely small time and length scales, PIC simulations have played a major role in understanding, modeling, and guiding high-intensity physics experiments. But a lack of PIC codes that have enough computational accuracy to model laser-matter interaction at ultra-high intensities has hindered the development of novel particle and light sources produced by this interaction.
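For orientation, a particle-in-cell code cycles through three steps: deposit the particles' charge onto a grid, solve the field equations on that grid, and push the particles in the resulting fields. The 1D electrostatic toy loop below illustrates that cycle; every name and parameter in it is illustrative rather than taken from the Berkeley/CEA tool, which is 3D, electromagnetic, and massively parallel.

```python
import numpy as np

# Toy 1D electrostatic PIC loop: deposit charge, solve Poisson's
# equation with an FFT, gather the field, push the particles.
# Illustrative only; production PIC codes are 3D, electromagnetic,
# and run in parallel across thousands of nodes.

ng, npart, L, dt, steps = 64, 10000, 2 * np.pi, 0.1, 100
dx = L / ng
rng = np.random.default_rng(0)
x = rng.uniform(0, L, npart)               # particle positions
v = 0.05 * rng.standard_normal(npart)      # particle velocities
q = -L / npart                             # electron macro-charge

k = 2 * np.pi * np.fft.rfftfreq(ng, d=dx)  # grid wavenumbers
for _ in range(steps):
    # 1) charge deposition (nearest-grid-point, for brevity),
    #    plus a uniform neutralizing ion background
    idx = (x / dx).astype(int) % ng
    rho = np.bincount(idx, minlength=ng) * q / dx + 1.0
    # 2) field solve: dE/dx = rho  =>  E_k = -i * rho_k / k
    rho_k = np.fft.rfft(rho)
    E_k = np.zeros_like(rho_k)
    E_k[1:] = -1j * rho_k[1:] / k[1:]
    E = np.fft.irfft(E_k, n=ng)
    # 3) gather and push (normalized electrons, charge/mass = -1)
    v -= E[idx] * dt
    x = (x + v * dt) % L

print("mean kinetic energy:", 0.5 * np.mean(v**2))
```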

Continue reading “A breakthrough in the study of laser/plasma interactions” »

Apr 16, 2019

Optimizing network software to advance scientific discovery

Posted in categories: mathematics, particle physics, supercomputing

High-performance computing (HPC)—the use of supercomputers and parallel processing techniques to solve large computational problems—is of great use in the scientific community. For example, scientists at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory rely on HPC to analyze the data they collect at the large-scale experimental facilities on site and to model complex processes that would be too expensive or impossible to demonstrate experimentally.

Modern science applications, such as large-scale physics simulations, often require a combination of aggregated computing power, high-speed networks for data transfer, large amounts of memory, and high-capacity storage capabilities. Advances in HPC hardware and software are needed to meet these requirements. Computer and computational scientists and mathematicians in Brookhaven Lab’s Computational Science Initiative (CSI) are collaborating with physicists, biologists, and other domain scientists to understand their data analysis needs and provide solutions to accelerate the scientific discovery process.
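As a minimal illustration of the parallel-processing half of that definition, the sketch below splits a single numerical job across local CPU cores using Python's standard library. The function names and parameters are invented for the example; real HPC applications distribute work across many nodes with frameworks such as MPI, but the pattern of partitioning work and aggregating partial results is the same.

```python
from multiprocessing import Pool
import math

# Toy "analysis kernel": integrate f(x) = 4 / (1 + x^2) over [0, 1]
# (which equals pi) by splitting the domain across worker processes.
# Illustrative only; real HPC codes distribute work across nodes.

def partial_sum(args):
    start, end, n = args
    h = 1.0 / n
    # midpoint-rule contribution for this worker's slice of the domain
    return sum(4.0 / (1.0 + ((i + 0.5) * h) ** 2) * h
               for i in range(start, end))

if __name__ == "__main__":
    n, workers = 10_000_000, 4
    chunk = n // workers
    tasks = [(w * chunk, (w + 1) * chunk, n) for w in range(workers)]
    with Pool(workers) as pool:
        pi_estimate = sum(pool.map(partial_sum, tasks))
    print(pi_estimate, "vs", math.pi)
```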

Read more

Apr 15, 2019

Even more frightening than military AI: an AI President of the Republic?

Posted in categories: government, military, robotics/AI, supercomputing

A recent survey by IE University in Madrid reveals that one in four Europeans would be ready to put an artificial intelligence in power. Should we be concerned for democracy or, on the contrary, welcome Europeans’ confidence in technology?

Europeans ready to elect an AI?

According to the study in question, about one in four of the 25,000 Europeans surveyed would be prepared to be governed by an AI. It is worth noting that there are significant variations between countries: while the European average is around 30%, respondents in the Netherlands are much more open to having a government run by a supercomputer (43%) than those in France (25%). “The idea of a pragmatic machine, impervious to fraud and corruption” is one of the reasons that seems most compelling to the interviewees. Added to this are the possibilities that machine learning would enable: the AI described would be able to improve by studying and selecting the best political decisions in the world… It would then be able to make better decisions than existing politicians.

Continue reading “Even more frightening than military AI: an AI President of the Republic?” »

Apr 10, 2019

Human Brain/Cloud Interface

Posted in categories: biotech/medical, education, internet, nanotechnology, Ray Kurzweil, robotics/AI, supercomputing

The Internet comprises a decentralized global system that serves humanity’s collective effort to generate, process, and store data, most of which is handled by the rapidly expanding cloud. A stable, secure, real-time system may allow for interfacing the cloud with the human brain. One promising strategy for enabling such a system, denoted here as a “human brain/cloud interface” (“B/CI”), would be based on technologies referred to here as “neuralnanorobotics.” Future neuralnanorobotics technologies are anticipated to facilitate accurate diagnoses and eventual cures for the ∼400 conditions that affect the human brain. Neuralnanorobotics may also enable a B/CI with controlled connectivity between neural activity and external data storage and processing, via the direct monitoring of the brain’s ∼86 × 10⁹ neurons and ∼2 × 10¹⁴ synapses. Subsequent to navigating the human vasculature, three species of neuralnanorobots (endoneurobots, gliabots, and synaptobots) could traverse the blood–brain barrier (BBB), enter the brain parenchyma, ingress into individual human brain cells, and autoposition themselves at the axon initial segments of neurons (endoneurobots), within glial cells (gliabots), and in intimate proximity to synapses (synaptobots). They would then wirelessly transmit up to ∼6 × 10¹⁶ bits per second of synaptically processed and encoded human–brain electrical information via auxiliary nanorobotic fiber optics (30 cm³) with the capacity to handle up to 10¹⁸ bits/sec and provide rapid data transfer to a cloud-based supercomputer for real-time brain-state monitoring and data extraction. A neuralnanorobotically enabled human B/CI might serve as a personalized conduit, allowing persons to obtain direct, instantaneous access to virtually any facet of cumulative human knowledge. Other anticipated applications include myriad opportunities to improve education, intelligence, entertainment, traveling, and other interactive experiences. A specialized application might be the capacity to engage in fully immersive experiential/sensory experiences, including what is referred to here as “transparent shadowing” (TS). Through TS, individuals might experience episodic segments of the lives of other willing participants (locally or remotely) to, hopefully, encourage and inspire improved understanding and tolerance among all members of the human family.
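As a quick back-of-the-envelope check, using only the figures quoted in the abstract above, the proposed bandwidth works out as follows (the numbers below simply restate the abstract's own):

```python
# Back-of-the-envelope arithmetic using only the figures quoted above.
neurons   = 86e9    # ~86 x 10^9 neurons
synapses  = 2e14    # ~2 x 10^14 synapses
bci_rate  = 6e16    # ~6 x 10^16 bits/sec proposed transmission rate
fiber_cap = 1e18    # ~10^18 bits/sec stated fiber-optic capacity

print(f"bits/sec per synapse: {bci_rate / synapses:.0f}")    # 300
print(f"bits/sec per neuron:  {bci_rate / neurons:,.0f}")    # ~697,674
print(f"capacity headroom:    {fiber_cap / bci_rate:.1f}x")  # ~16.7x
```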

“We’ll have nanobots that… connect our neocortex to a synthetic neocortex in the cloud… Our thinking will be a… biological and non-biological hybrid.”

— Ray Kurzweil, TED 2014

Continue reading “Human Brain/Cloud Interface” »

Apr 5, 2019

Getting a big look at tiny particles

Posted in categories: biotech/medical, nuclear energy, quantum physics, supercomputing

At the turn of the 20th century, scientists discovered that atoms were composed of smaller particles. They found that inside each atom, negatively charged electrons orbit a nucleus made of positively charged protons and neutral particles called neutrons. This discovery led to research into atomic nuclei and subatomic particles.

An understanding of these particles’ structures provides crucial insights about the forces that hold matter together and enables researchers to apply this knowledge to other scientific problems. Although electrons have been relatively straightforward to study, protons and neutrons have proved more challenging. Protons are used in medical treatments, scattering experiments, and fusion energy, but nuclear scientists have struggled to precisely measure their underlying structure—until now.

In a recent paper, a team led by Constantia Alexandrou at the University of Cyprus modeled the location of one of the subatomic particles inside a proton, using only the basic theory of the strong interactions that hold matter together rather than assuming these particles would act as they had in experiments. The researchers employed the 27-petaflop Cray XK7 Titan supercomputer at the Oak Ridge Leadership Computing Facility (OLCF) and a method called lattice quantum chromodynamics (QCD). The combination allowed them to map the particles on a space-time grid and calculate their interactions with high accuracy and precision.
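Lattice QCD discretizes space-time into a grid and evaluates the theory by statistically sampling field configurations. Full QCD is far beyond a snippet, but the same Metropolis Monte Carlo machinery can be shown on the simplest lattice model there is, the 2D Ising model. The code below is a toy stand-in for the sampling idea only, not anything resembling the actual calculation.

```python
import numpy as np

# Toy illustration of lattice Monte Carlo sampling (Metropolis) on a
# 2D Ising model. Lattice QCD applies the same statistical machinery
# to a 4D space-time grid of quark and gluon fields; this is only a
# stand-in for the sampling idea, not a QCD calculation.

rng = np.random.default_rng(0)
L, beta, sweeps = 32, 0.44, 200          # lattice size, inverse temperature
spins = rng.choice([-1, 1], size=(L, L))

for _ in range(sweeps):
    for _ in range(L * L):
        i, j = rng.integers(0, L, size=2)
        # energy change from flipping spin (i, j), periodic boundaries
        nn = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j] +
              spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        dE = 2 * spins[i, j] * nn
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            spins[i, j] *= -1            # accept the flip

print("magnetization per site:", spins.mean())
```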

Read more

Mar 31, 2019

Supercomputers help supercharge protein assembly

Posted in categories: biotech/medical, supercomputing

Red blood cells are amazing. They pick up oxygen from our lungs and carry it all over our body to keep us alive. The hemoglobin molecule in red blood cells transports oxygen by changing its shape in an all-or-nothing fashion. Four copies of the same protein in hemoglobin open and close like flower petals, structurally coupled so that they respond to each other. Using supercomputers, scientists are just starting to design proteins that self-assemble into complexes resembling life-giving molecules like hemoglobin. The scientists say their methods could be applied to useful technologies such as pharmaceutical targeting, artificial energy harvesting, “smart” sensing and building materials, and more.
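The "all-or-nothing" cooperativity mentioned above is classically described by the Hill equation, in which a Hill coefficient n > 1 produces hemoglobin's switch-like binding curve (n is roughly 2.8 for hemoglobin, versus n = 1 for a non-cooperative binder such as myoglobin). A quick sketch of that relationship, with textbook-approximate parameters:

```python
# Hill equation: fractional saturation as a function of oxygen pressure.
# A Hill coefficient n > 1 gives hemoglobin its switch-like,
# "all-or-nothing" binding curve; n = 1 is non-cooperative binding.

def hill(p, p50, n):
    """Fraction of binding sites occupied at partial pressure p."""
    return p**n / (p50**n + p**n)

p50_hb, n_hb = 26.0, 2.8   # hemoglobin: p50 ~26 mmHg, n ~2.8 (approx.)
for p in (10, 26, 40, 100):
    print(f"pO2 = {p:3d} mmHg: "
          f"cooperative {hill(p, p50_hb, n_hb):.2f}, "
          f"non-cooperative {hill(p, p50_hb, 1.0):.2f}")
```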

Read more

Mar 20, 2019

Supercomputer sheds light on how droplets merge

Posted in categories: 3D printing, climatology, supercomputing

Scientists have revealed the precise molecular mechanisms that cause drops of liquid to combine, in a discovery that could have a range of applications.

Insights into how droplets merge could help make 3D printing technologies more accurate and may help improve the forecasting of thunderstorms and other weather events, the study suggests.

Read more

Mar 20, 2019

A surprising, cascading earthquake

Posted in categories: physics, supercomputing

The Kaikoura earthquake in New Zealand in 2016 caused widespread damage. LMU researchers have now dissected its mechanisms, revealing surprising insights into earthquake physics with the aid of simulations carried out on the supercomputer SuperMUC.

The 2016 Kaikoura earthquake (magnitude 7.8) on the South Island of New Zealand is among the most intriguing and best-documented seismic events anywhere in the world – and one of the most complex. The earthquake exhibited a number of unusual features, and the underlying geophysical processes have since been the subject of controversy. LMU geophysicists Thomas Ulrich and Dr. Alice-Agnes Gabriel, in cooperation with researchers based at the Université Côte d’Azur in Valbonne and at Hong Kong Polytechnic University, have now simulated the course of the earthquake with an unprecedented degree of realism. Their model, which was run on the Bavarian Academy of Sciences’ supercomputer SuperMUC at the Leibniz Computing Center (LRZ) in Munich, elucidates the dynamic reasons for such an uncommon multi-segment earthquake. This is an important step towards improving the accuracy of earthquake hazard assessments in other parts of the world. Their findings appear in the online journal Nature Communications.

Continue reading “A surprising, cascading earthquake” »

Mar 14, 2019

Why modern enterprises need to adopt cognitive computing for faster business growth in a digital economy

Posted in categories: business, economics, robotics/AI, supercomputing

Cognitive computing (CC) technology revolves around making computers adept at mimicking the processes of the human brain, which is, in essence, making them more intelligent. Even though the phrase cognitive computing is used synonymously with AI, the term is closely associated with IBM’s cognitive computer system, Watson. IBM Watson is a supercomputer that leverages disruptive AI-based technologies such as machine learning (ML), real-time analysis, and natural language processing to augment decision-making and deliver superior outcomes.

Read more
