BLOG

Archive for the ‘information science’ category: Page 152

Nov 13, 2020

New study outlines steps higher education should take to prepare a new quantum workforce

Posted by in categories: education, employment, information science, quantum physics

A new study outlines ways colleges and universities can update their curricula to prepare the workforce for a new wave of quantum technology jobs. Three researchers, including Rochester Institute of Technology Associate Professor Ben Zwickl, suggested steps that need to be taken in a new paper in Physical Review Physics Education Research after interviewing managers at more than 20 quantum technology companies across the U.S.

The study’s authors from University of Colorado Boulder and RIT set out to better understand the types of entry-level positions that exist in these companies and the educational pathways that might lead into those jobs. They found that while the companies still seek employees with traditional STEM degrees, they want the candidates to have a grasp of fundamental concepts in quantum information science and technology.

“For a lot of those roles, there’s this idea of being ‘quantum aware’ that’s highly desirable,” said Zwickl, a member of RIT’s Future Photon Initiative and Center for Advancing STEM Teaching, Learning and Evaluation. “The companies told us that many positions don’t need to have deep expertise, but students could really benefit from a one- or two-semester introductory sequence that teaches the foundational concepts, some of the hardware implementations, how the algorithms work, what a qubit is, and things like that. Then a graduate can bring in all the strength of a traditional STEM degree but can speak the language that the company is talking about.”
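
For readers outside the field, those foundational concepts are quite concrete. As an illustrative sketch (not drawn from the study), a single qubit can be represented as two complex amplitudes, with measurement probabilities given by their squared magnitudes; the numbers and the Hadamard-gate example below are purely hypothetical teaching material:

```python
import numpy as np

# Illustrative sketch only: a qubit state |psi> = a|0> + b|1> stored as two
# complex amplitudes. Values here are hypothetical, not taken from the study.
a, b = 1 / np.sqrt(2), 1 / np.sqrt(2)       # an equal superposition
state = np.array([a, b], dtype=complex)

# Measurement probabilities are the squared magnitudes of the amplitudes.
p0, p1 = np.abs(state) ** 2
print(f"P(0) = {p0:.2f}, P(1) = {p1:.2f}")  # 0.50 and 0.50

# A Hadamard gate, one of the basic operations an introductory course would cover.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
print(H @ state)                             # maps this superposition back to |0>
```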

Nov 13, 2020

Google Brain Paper Demystifies Learned Optimizers

Posted by in categories: information science, robotics/AI

Learned optimizers are algorithms that can be trained to solve optimization problems. Although learned optimizers can outperform baseline optimizers in restricted settings, the ML research community understands remarkably little about their inner workings or why they work as well as they do. In a paper currently under review for ICLR 2021, a Google Brain research team attempts to shed some light on the matter.

The researchers explain that optimization algorithms can be considered the basis of modern machine learning. A popular research area in recent years has focused on learning optimization algorithms by directly parameterizing and training an optimizer on a distribution of tasks.

Research on learned optimizers aims to replace the baseline “hand-designed” optimizers with a parametric optimizer trained on a set of tasks, which can then be applied more generally. In contrast to baseline optimizers that use simple update rules derived from theoretical principles, learned optimizers use flexible, high-dimensional, nonlinear parameterizations.
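
To make the contrast concrete, here is a minimal sketch, not the architecture from the Google Brain paper, of a hand-designed update rule versus a learned one: stochastic gradient descent applies a fixed formula, while a learned optimizer feeds per-parameter features through a small trainable network whose weights (the meta-parameters `theta` below, a hypothetical stand-in) would be meta-trained on a distribution of tasks.

```python
import numpy as np

def sgd_update(param, grad, lr=0.01):
    """Hand-designed rule: a fixed formula derived from theoretical principles."""
    return param - lr * grad

def learned_update(param, grad, theta):
    """Sketch of a learned rule: a tiny MLP (weights `theta`) maps per-parameter
    features (here just the gradient and the parameter value) to an update.
    This toy version only illustrates the flexible, nonlinear parameterization."""
    features = np.stack([grad, param], axis=-1)         # shape (n_params, 2)
    hidden = np.tanh(features @ theta["W1"] + theta["b1"])
    step = hidden @ theta["W2"] + theta["b2"]           # shape (n_params, 1)
    return param - step.squeeze(-1)

# Hypothetical meta-parameters for a 2-unit hidden layer.
rng = np.random.default_rng(0)
theta = {
    "W1": rng.normal(scale=0.1, size=(2, 2)),
    "b1": np.zeros(2),
    "W2": rng.normal(scale=0.1, size=(2, 1)),
    "b2": np.zeros(1),
}

params = rng.normal(size=5)
grads = rng.normal(size=5)
print(sgd_update(params, grads))
print(learned_update(params, grads, theta))
```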

Nov 13, 2020

Researchers make most precise measurements of deuterium fusing with a proton to form helium-3

Posted by in categories: cosmology, information science, physics

A large team of researchers affiliated with a host of institutions in Italy, the U.K. and Hungary has carried out the most precise measurements yet of deuterium fusing with a proton to form helium-3. In their paper published in the journal Nature, the group describes their effort and how they believe it will contribute to a better understanding of the events that transpired during the first few minutes after the Big Bang.

Astrophysics theory suggests that the creation of deuterium was one of the first things that happened after the Big Bang. Therefore, it plays an important role in Big Bang nucleosynthesis—the reactions that happened afterward that led to the production of several of the light elements. Theorists have developed equations that show the likely series of events that occurred, but to date, it has been difficult to prove them correct without physical evidence. In this new effort, the researchers working at the Laboratory for Underground Nuclear Astrophysics in Italy have carried out experiments to simulate those first few minutes, hoping to confirm the theories.

The work was conducted deep under the thick rock cover of the Gran Sasso mountain to prevent interference from cosmic rays. It involved firing a beam of protons at a deuterium target (deuterium being a form of hydrogen with one proton and one neutron) and then measuring the rate of fusion. But because the rate of fusion is so low, the bombardment had to be carried out many times; the team carried out their work nearly every weekend for three years.
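
As a back-of-the-envelope illustration of the reaction being studied, p + d → ³He + γ, the energy it releases (its Q-value) can be computed from standard atomic masses; note that the LUNA measurement concerns the reaction rate, which this simple calculation does not capture.

```python
# Back-of-the-envelope sketch: Q-value of the reaction p + d -> 3He + gamma,
# computed from standard atomic masses (values in unified atomic mass units).
# This illustrates the energetics only; the measurement described above concerns
# how *often* the reaction occurs, which these numbers do not give you.
U_TO_MEV = 931.494      # energy equivalent of one atomic mass unit, in MeV

m_p   = 1.007825        # hydrogen-1
m_d   = 2.014102        # deuterium
m_he3 = 3.016029        # helium-3

q_value = (m_p + m_d - m_he3) * U_TO_MEV
print(f"Q ≈ {q_value:.2f} MeV")   # roughly 5.5 MeV released per fusion
```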

Nov 12, 2020

New algorithm provides 50 times faster Deep Learning

Posted by in categories: information science, neuroscience, robotics/AI

Using algorithms derived from neuroscience, AI research company Numenta has achieved a dramatic performance improvement in deep learning networks, without any loss in accuracy. Their breakthrough is also vastly more energy efficient.
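
The announcement does not spell out the mechanism, and the sketch below is an assumption on our part rather than a description of Numenta's implementation: the company has elsewhere attributed such gains to sparse networks, and a k-winners-take-all activation is one minimal way to express that idea.

```python
import numpy as np

def k_winners_take_all(x, k):
    """Keep only the k largest activations in each row; zero out the rest.
    Sparse activations like this are one way (assumed here, not confirmed by
    the article) to cut the multiply-accumulate work in a deep network."""
    out = np.zeros_like(x)
    idx = np.argpartition(x, -k, axis=-1)[..., -k:]   # indices of the top-k entries
    np.put_along_axis(out, idx, np.take_along_axis(x, idx, axis=-1), axis=-1)
    return out

x = np.random.default_rng(1).normal(size=(2, 10))
sparse = k_winners_take_all(x, k=2)
print((sparse != 0).sum(axis=-1))   # 2 active units per row instead of 10
```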

Nov 11, 2020

Scientists create a chemical space mapping method and crack the mystery of Mendeleev number

Posted by in categories: chemistry, information science, mapping, particle physics

Scientists have long sought a system for predicting the properties of materials based on their chemical composition. In particular, they set their sights on the concept of a chemical space that places materials in a reference frame such that neighboring chemical elements and compounds plotted along its axes have similar properties. This idea was first proposed in 1984 by the British physicist David G. Pettifor, who assigned a Mendeleev number (MN) to each element. Yet the meaning and origin of MNs were unclear. Scientists from the Skolkovo Institute of Science and Technology (Skoltech) puzzled out the physical meaning of the mysterious MNs and suggested calculating them based on the fundamental properties of atoms. They showed that both MNs and the chemical space built around them were more effective than empirical solutions proposed until then. Their research, supported by a grant from the Russian Science Foundation’s (RSF) World-class Lab Research Presidential Program, was presented in The Journal of Physical Chemistry C.

Systematizing the enormous variety of chemical compounds, both known and hypothetical, and pinpointing those with a particularly interesting property is a tall order. Measuring the properties of all imaginable compounds in experiments or calculating them theoretically is downright impossible, which suggests that the search should be narrowed down to a smaller space.

David G. Pettifor put forward the idea of chemical space in an attempt to somehow organize the knowledge about material properties. The chemical space is basically a map where elements are plotted along the axes in a certain sequence such that the neighboring elements, for instance, Na and K, have similar properties. The points within the space represent compounds, so that the neighbors, for example, NaCl and KCl, have similar properties, too. In this setting, one area is occupied by superhard materials and another by ultrasoft ones. Having the space at hand, one could create an algorithm for finding the best material among all possible compounds of all elements. To build their “smart” map, Skoltech scientists Artem R. Oganov and Zahed Allahyari came up with their own universal approach that boasts the highest predictive power compared to the best-known methods.
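
As a toy illustration of the chemical-space idea, binary compounds can be placed at coordinates given by the Mendeleev numbers of their elements, so that chemically similar compounds land near one another; the numbers below are made-up placeholders, not the values derived by Oganov and Allahyari.

```python
# Toy sketch of a Pettifor-style chemical space. The Mendeleev numbers below are
# hypothetical placeholders, NOT the values derived in the Skoltech study.
mendeleev_number = {"Na": 10, "K": 11, "Cl": 94, "Br": 93}

def coordinates(compound):
    """Place a binary compound (A, B) at the point (MN(A), MN(B))."""
    a, b = compound
    return mendeleev_number[a], mendeleev_number[b]

for compound in [("Na", "Cl"), ("K", "Cl"), ("Na", "Br"), ("K", "Br")]:
    print(compound, "->", coordinates(compound))
# NaCl, KCl, NaBr and KBr land near one another in this space, mirroring the
# idea that neighboring points should have similar properties.
```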

Nov 11, 2020

DARPA Selects Teams to Further Advance Dogfighting Algorithms

Posted by in categories: information science, military, robotics/AI

DARPA recently awarded contracts to five companies to develop algorithms enabling mixed teams of manned and unmanned combat aircraft to conduct aerial dogfighting autonomously.

Boeing, EpiSci, Georgia Tech Research Institute, Heron Systems, and physicsAI were chosen to develop air combat maneuvering algorithms for individual and team tactical behaviors under Technical Area (TA) 1 of DARPA’s Air Combat Evolution (ACE) program. Each team is tasked with developing artificial intelligence agents that expand one-on-one engagements to two-on-one and two-on-two within-visual-range aerial battles. The companies’ algorithms will be tested in each of three program phases: modeling and simulation, sub-scale unmanned aircraft, and full-scale combat representative aircraft scheduled in 2023.

“The TA1 performers include a large defense contractor, a university research institute, and boutique AI firms, who will build upon the first-gen autonomous dogfighting algorithms demonstrated in the AlphaDogfight Trials this past August,” said Air Force Col. Dan “Animal” Javorsek, program manager in DARPA’s Strategic Technology Office. “We will be evaluating how well each performer is able to advance their algorithms to handle individual and team tactical aircraft behaviors, in addition to how well they are able to scale the capability from a local within-visual-range environment to the broader, more complex battlespace.”

Nov 11, 2020

Indianapolis Testing Advances Capabilities of Chemical, Biological Threat Detection Sensors

Posted by in categories: biological, chemistry, information science, transportation

DARPA’s SIGMA+ program conducted a week-long deployment of advanced chemical and biological sensing systems in the Indianapolis metro region in August, collecting more than 250 hours of daily-life background atmospheric data across five neighborhoods that helped train algorithms to more accurately detect chemical and biological threats. The testing marked the first time in the program that the advanced laboratory-grade instruments for chemical and biological sensing were successfully deployed as mobile sensors, increasing their versatility on the SIGMA+ network.

“Spending a week gathering real-world background data from a major Midwestern metropolitan region was extremely valuable as we further develop our SIGMA+ sensors and networks to provide city and regional-scale coverage for chem and bio threat detection,” said Mark Wrobel, program manager in DARPA’s Defense Sciences Office. “Collecting chemical and biological environment data provided an enhanced understanding of the urban environment and is helping us make refinements of the threat-detection algorithms to minimize false positives and false negatives.”
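
As a simplified, hypothetical illustration of why background data matters (not a description of the actual SIGMA+ algorithms), a detection threshold can be derived from measured background readings so that only a chosen fraction of ordinary readings would trigger an alarm:

```python
import numpy as np

# Hypothetical sketch: use background measurements to pick an alarm threshold.
# None of the numbers or the detection logic here come from the SIGMA+ program.
rng = np.random.default_rng(42)
background = rng.normal(loc=100.0, scale=10.0, size=5000)   # stand-in for a week of readings

# Choose a threshold so that only ~0.1% of ordinary background readings would alarm.
threshold = np.quantile(background, 0.999)

def alarm(reading):
    """Flag a reading as a potential threat if it exceeds the background-derived threshold."""
    return reading > threshold

print(f"threshold = {threshold:.1f}")
print(alarm(105.0), alarm(150.0))   # typical background vs. a strongly elevated reading
```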

SIGMA+ expands on the original SIGMA program’s advanced capability to detect illicit radioactive and nuclear materials by developing new sensors and networks that would alert authorities with high sensitivity to chemical, biological, and explosives threats as well. SIGMA, which began in 2014, has demonstrated city-scale capability for detecting radiological threats and is now operationally deployed with the Port Authority of New York and New Jersey, helping protect the greater New York City region.

Nov 11, 2020

Samsung develops a slim-panel holographic video display

Posted by in categories: information science, mathematics, mobile phones

A team of researchers at Samsung has developed a slim-panel holographic video display that allows for viewing from a variety of angles. In their paper published in the journal Nature Communications, the group describes their new display device and their plans for making it suitable for use with a smartphone.

Despite predictions in science-fiction books and movies over the past several decades, 3D holographic players are still not available to consumers. Existing players are too bulky and display video from limited viewing angles. In this new effort, the researchers at Samsung claim to have overcome these difficulties and built a demo device to prove it.


Nov 6, 2020

Applying particle physics methods to quantum computing

Posted by in categories: computing, information science, particle physics, quantum physics, space

Borrowing a page from high-energy physics and astronomy textbooks, a team of physicists and computer scientists at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) has successfully adapted and applied a common error-reduction technique to the field of quantum computing.

In the world of subatomic particles and giant particle detectors, and distant galaxies and giant telescopes, scientists have learned to live, and to work, with uncertainty. They are often trying to tease out ultra-rare particle interactions from a massive tangle of other particle interactions and background “noise” that can complicate their hunt, or trying to filter out the effects of atmospheric distortions and interstellar dust to improve the resolution of astronomical imaging.

Also, inherent problems with detectors, such as with their ability to record all particle interactions or to exactly measure particles’ energies, can result in data getting misread by the electronics they are connected to, so scientists need to design complex filters, in the form of computer algorithms, to reduce the margin of error and return the most accurate results.
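
To illustrate the kind of correction being borrowed, here is a generic sketch, not the Berkeley Lab team's specific method: detector distortions are modeled as a response matrix that smears the true counts, and the measurement is corrected by inverting that model.

```python
import numpy as np

# Generic sketch of a response-matrix correction, the kind of detector-effect
# "filter" described above. This is not the Berkeley Lab team's specific algorithm.
# Columns: true category; rows: measured category. Off-diagonal entries model
# the chance that a true outcome is recorded as something else.
response = np.array([
    [0.90, 0.10],
    [0.10, 0.90],
])

true_counts = np.array([1000.0, 200.0])
measured = response @ true_counts            # what the imperfect detector reports
print("measured:", measured)                 # [920., 280.]

# Correct the measurement by inverting the response model (simple unfolding).
recovered = np.linalg.solve(response, measured)
print("recovered:", recovered)               # back to ~[1000., 200.]
```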

Oct 30, 2020

AI has cracked a key mathematical puzzle for understanding our world

Posted by in categories: information science, mathematics, robotics/AI, transportation

Unless you’re a physicist or an engineer, there really isn’t much reason for you to know about partial differential equations. I know. After years of poring over them in undergrad while studying mechanical engineering, I’ve never used them in the real world since.

But partial differential equations, or PDEs, are also kind of magical. They’re a category of math equations that are really good at describing change over space and time, and thus very handy for describing the physical phenomena in our universe. They can be used to model everything from planetary orbits to plate tectonics to the air turbulence that disturbs a flight, which in turn allows us to do practical things like predict seismic activity and design safe planes.

The catch is that PDEs are notoriously hard to solve. And here, the meaning of “solve” is perhaps best illustrated by an example. Say you are trying to simulate air turbulence to test a new plane design. There is a known PDE called Navier-Stokes that is used to describe the motion of any fluid. “Solving” Navier-Stokes allows you to take a snapshot of the air’s motion (a.k.a. wind conditions) at any point in time and model how it will continue to move, or how it was moving before.
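
Navier-Stokes itself is far too complex for a short example, but what “solving” a PDE numerically involves can be sketched with a much simpler equation, the one-dimensional heat equation, stepped forward in time on a grid with finite differences:

```python
import numpy as np

# Minimal sketch of numerically "solving" a PDE: the 1D heat equation
# du/dt = alpha * d^2u/dx^2, advanced with an explicit finite-difference scheme.
# (Navier-Stokes is vastly harder, but the idea of stepping a state forward
# in time on a grid is the same.)
alpha, dx, dt = 1.0, 0.1, 0.004            # dt satisfies the stability limit dt <= dx**2 / (2 * alpha)
x = np.arange(0.0, 1.0 + dx, dx)
u = np.exp(-((x - 0.5) ** 2) / 0.01)       # initial condition: a hot spot in the middle

for _ in range(100):                        # take 100 time steps
    laplacian = (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx**2
    u = u + alpha * dt * laplacian
    u[0] = u[-1] = 0.0                      # hold the endpoints at zero temperature

print(u.round(3))                           # the hot spot has diffused outward
```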