ARLINGTON, Va. — The Army successfully tested its ability to redirect munitions in flight Aug. 28 in an experiment over the Mojave Desert involving an unmanned aircraft, smart sensors and artificial intelligence.
It was the “signature experiment for FY19,” said Brig. Gen. Walter T. Rugen, director of the Future Vertical Lift Cross-Functional Team, speaking Thursday at the Association of the U.S. Army’s “Hot Topic” forum on aviation.
The experiment at Naval Air Weapons Station China Lake, California, tested a capability developed by his CFT called A3I, which stands for Architecture, Automation, Autonomy and Interfaces.
Sept. 10 (UPI) — McDonald’s on Tuesday announced the acquisition of a company that will assist in automating its drive-through process.
The fast-food chain agreed to a deal to acquire Apprente, a California-based company that was founded in 2017 with a focus on creating voice-based platforms for “complex, multilingual, multi-accent and multi-item conversational ordering.”
McDonald’s said Apprente’s technology will be used to allow for faster, simpler and more accurate order taking at its drive-throughs and may later be incorporated into mobile ordering and kiosks.
An unpiloted Japanese supply ship will launch to the International Space Station today (Sept. 10), and you can watch it leave Earth live courtesy of NASA and the Japan Aerospace Exploration Agency (JAXA).
The robotic spacecraft HTV-8 (also known as Kounotori8) will launch toward the space station from the Tanegashima Space Center in southern Japan at 5:33 p.m. EDT (2133 GMT). It will be 6:33 a.m. local time Wednesday at the launch site. You can watch the launch live here and on Space.com’s homepage via NASA TV at 5 p.m. EDT (2100 GMT). JAXA is offering its own webcast here beginning at 5:07 p.m. EDT (2107 GMT).
This would help in understanding artificial neural networks (ANNs): AI models and programs that mimic the workings of the human brain so that machines can learn to make decisions in a more human-like manner.
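The core idea behind an artificial neural network can be illustrated with a minimal sketch: a single artificial “neuron” that learns the logical AND function by adjusting its weights from examples. All names and values here are illustrative, not drawn from the article.

```python
# Minimal sketch of an artificial neuron (perceptron) learning the
# logical AND function from examples. Purely illustrative.

def step(x):
    """Threshold activation: fire (1) if the weighted input is non-negative."""
    return 1 if x >= 0 else 0

def train_perceptron(samples, epochs=20, lr=0.1):
    """Learn two weights and a bias from (inputs, target) pairs."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            pred = step(w[0] * x1 + w[1] * x2 + b)
            err = target - pred
            # Nudge each weight in the direction that reduces the error
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

AND_DATA = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(AND_DATA)
predictions = [step(w[0] * x1 + w[1] * x2 + b) for (x1, x2), _ in AND_DATA]
print(predictions)  # the neuron has learned AND from the examples
```

Real ANNs stack many such units into layers and use smoother activations and gradient-based training, but the decision-from-examples principle is the same.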
Researchers at the University of California, Los Angeles (UCLA) and the California NanoSystems Institute in Los Angeles have recently developed a soft swimming robot based on a self-sustained hydrogel oscillator. This robot, presented in a paper published in Science Robotics, operates under constant light input without the need for a battery.
“When I shone light on a soft, fast responsive hydrogel pillar, I observed the pillar started to oscillate around the optical beam,” Yusen Zhao, a Ph.D. student involved in the research, said. “It looked very intriguing to me, and I wondered: How can a constant input produce intermittent output? Under what conditions does the oscillation happen? Would it be powerful enough to propel and swim in water, and eventually lead to solar sails? With these questions, I continued systematic studies aiming to achieve these objectives.”
Zhao and his colleagues developed a soft oscillator made of a light-responsive gel molded into the shape of a pillar or strip. When light hits a spot on the gel pillar, it is absorbed and converted into heat. The locally heated spot ejects some of its water and shrinks in volume, causing the pillar's tail to bend toward the light source.
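The feedback loop described above — constant light heats the gel, the heated spot deforms it out of the beam, it cools and springs back — can be sketched as a toy relaxation oscillator. The thresholds and rates below are illustrative assumptions, not measured values from the paper.

```python
# Toy relaxation-oscillator model of the hydrogel pillar: a constant
# light input produces periodic motion because heating and cooling
# trip opposite thresholds. All parameters are illustrative.

def simulate(steps=1000, heat_rate=1.0, cool_rate=0.5,
             bend_at=10.0, relax_at=2.0):
    """Return the number of bend/relax cycles under constant illumination."""
    temp = 0.0
    in_beam = True          # pillar starts upright, inside the beam
    cycles = 0
    for _ in range(steps):
        if in_beam:
            temp += heat_rate            # light absorbed -> local heating
            if temp >= bend_at:          # hot spot shrinks; pillar bends out
                in_beam = False
        else:
            temp -= cool_rate            # out of the beam, the gel cools
            if temp <= relax_at:         # reswells and straightens back
                in_beam = True
                cycles += 1              # one full oscillation completed
    return cycles

print(simulate())  # constant input, intermittent (periodic) output
```

The point of the sketch is Zhao's question in miniature: nothing in the input varies with time, yet the output oscillates, because the system's own state switches it between heating and cooling regimes.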
Machine learning techniques have so far proved to be very promising for the analysis of data in several fields, with many potential applications. However, researchers have found that applying these methods to quantum physics problems is far more challenging due to the exponential complexity of many-body systems.
Once the realm of science fiction, voice-mimicking software is now “well within the range of any lay criminal who’s got creativity to spare,” one cybersecurity expert said.