BLOG

Archive for the ‘information science’ category: Page 188

Oct 11, 2019

Biologically-inspired skin improves robots’ sensory abilities

Posted by in categories: cyborgs, information science, robotics/AI

Sensitive synthetic skin enables robots to sense their own bodies and surroundings—a crucial capability if they are to be in close contact with people. Inspired by human skin, a team at the Technical University of Munich (TUM) has developed a system combining artificial skin with control algorithms and used it to create the first autonomous humanoid robot with full-body artificial skin.

The artificial skin developed by Prof. Gordon Cheng and his team consists of hexagonal cells about the size of a two-euro coin (i.e. about one inch in diameter). Each cell is equipped with a microprocessor and sensors to detect contact, acceleration, proximity and temperature. Such artificial skin enables robots to perceive their surroundings in much greater detail and with more sensitivity. This not only helps them to move safely; it also makes them safer when operating near people and gives them the ability to anticipate and actively avoid accidents.
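As a rough illustration of the per-cell design described above, the sketch below models one hexagonal cell's sensor packet and a simple safety check. The field names, units, and threshold are hypothetical, not TUM's actual firmware interface.

```python
# Hypothetical sketch of a skin-cell reading; field names and the
# 50 mm proximity threshold are illustrative assumptions, not TUM's API.
from dataclasses import dataclass


@dataclass
class SkinCellReading:
    cell_id: int
    contact: bool           # touch detected on the hexagonal cell
    acceleration: tuple     # (ax, ay, az) in m/s^2 from the cell's sensor
    proximity_mm: float     # distance to the nearest object
    temperature_c: float    # local surface temperature


def is_safety_event(r: SkinCellReading, proximity_threshold_mm: float = 50.0) -> bool:
    """Flag readings that should make the robot slow down or yield."""
    return r.contact or r.proximity_mm < proximity_threshold_mm


reading = SkinCellReading(cell_id=7, contact=False,
                          acceleration=(0.0, 0.0, 9.81),
                          proximity_mm=32.0, temperature_c=24.5)
print(is_safety_event(reading))  # True: an object is within 50 mm
```

A real system would aggregate thousands of such readings, which is why the actual robot processes them selectively rather than streaming every cell continuously.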

Oct 9, 2019

Bio-Mimetic Real-Time Cortex Project — Whole Brain Emulation — Dr. Alice Parker — University of Southern California — ideaXme — Ira Pastor

Posted by in categories: big data, bioengineering, complex systems, driverless cars, drones, electronics, engineering, information science, neuroscience, robotics/AI

Oct 8, 2019

An AI Pioneer Wants His Algorithms to Understand the ‘Why’

Posted by in categories: information science, robotics/AI

Deep learning is good at finding patterns in reams of data, but can’t explain how they’re connected. Turing Award winner Yoshua Bengio wants to change that.

Oct 8, 2019

Are Black Holes Made of Dark Energy? Error Made When Applying Einstein’s Equations to Model Growth of the Universe?

Posted by in categories: cosmology, information science, physics

Two University of Hawaiʻi at Mānoa researchers have identified and corrected a subtle error that was made when applying Einstein’s equations to model the growth of the universe.

Physicists usually assume that a cosmologically large system, such as the universe, is insensitive to details of the small systems contained within it. Kevin Croker, a postdoctoral research fellow in the Department of Physics and Astronomy, and Joel Weiner, a faculty member in the Department of Mathematics, have shown that this assumption can fail for the compact objects that remain after the collapse and explosion of very large stars.

“For 80 years, we’ve generally operated under the assumption that the universe, in broad strokes, was not affected by the particular details of any small region,” said Croker. “It is now clear that general relativity can observably connect collapsed stars—regions the size of Honolulu—to the behavior of the universe as a whole, over a thousand billion billion times larger.”

Oct 5, 2019

How Will We Store Three Septillion Bits of Data? Your Metabolome May Have the Answer

Posted by in categories: biological, computing, information science, neuroscience

For the “big data” revolution to continue, we need to radically rethink our hard drives. Thanks to evolution, we already have a clue.

Our bodies are jam-packed with data, tightly compacted inside microscopic structures within every cell. Take DNA: with just four letters we’re able to generate every single molecular process that keeps us running. That sort of combinatorial complexity is still unheard of in silicon-based data storage in computer chips.

Add this to the fact that DNA can be dehydrated and kept intact for eons (500,000 years and counting) and it's no surprise that scientists have been exploiting its properties to encode information. To famed synthetic biologist Dr. George Church, looking to biology is a no-brainer: even the simple bacterium E. coli has a data storage density of 10^19 bits per cubic centimeter. Translation? A single cube of DNA measuring one meter on each side could meet all of the world's current data storage needs.
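The claim is easy to sanity-check with back-of-the-envelope arithmetic, using the article's own figures (10^19 bits per cubic centimeter, and the headline's "three septillion bits," i.e. 3 × 10^24 bits):

```python
# Back-of-the-envelope check of the DNA storage claim, using the
# density figure quoted in the article for E. coli DNA.
DENSITY_BITS_PER_CM3 = 1e19

cm3_per_m3 = 100 ** 3                                    # 1,000,000 cm^3 per m^3
cube_capacity_bits = DENSITY_BITS_PER_CM3 * cm3_per_m3   # 1e25 bits in a 1 m cube

world_data_bits = 3e24   # "three septillion bits" from the headline

print(f"1 m^3 DNA cube holds ~{cube_capacity_bits:.0e} bits")
print(f"That covers the projected demand "
      f"{cube_capacity_bits / world_data_bits:.1f}x over")
```

So a single cubic meter at that density would hold roughly 10^25 bits, about three times the headline figure, which is consistent with the article's claim.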

Oct 3, 2019

Machine learning predicts behavior of biological circuits

Posted by in categories: biotech/medical, information science, robotics/AI

Biomedical engineers at Duke University have devised a machine learning approach to modeling the interactions between complex variables in engineered bacteria that would otherwise be too cumbersome to predict. Their algorithms are generalizable to many kinds of biological systems.

In the new study, the researchers trained a neural network to predict the circular patterns that would be created by a biological circuit embedded into a bacterial culture. The system worked 30,000 times faster than the existing computational model.

To further improve accuracy, the team devised a method for retraining the machine learning model multiple times to compare their answers. Then they used it to solve a second biological system that is computationally demanding in a different way, showing the algorithm can work for disparate challenges.

Oct 2, 2019

Even the AI Behind Deepfakes Can’t Save Us From Being Duped

Posted by in categories: information science, robotics/AI

Google and Facebook are releasing troves of deepfakes to teach algorithms how to detect them. But the human eye will be needed for a long time.

Oct 1, 2019

Quantum Superposition Record: 2000 Atoms in Two Places at Once

Posted by in categories: information science, particle physics, quantum physics

The quantum superposition principle has been tested on a scale as never before in a new study by scientists at the University of Vienna in collaboration with the University of Basel. Hot, complex molecules composed of nearly two thousand atoms were brought into a quantum superposition and made to interfere. By confirming this phenomenon – “the heart of quantum mechanics”, in Richard Feynman’s words – on a new mass scale, improved constraints on alternative theories to quantum mechanics have been placed. The work was published in Nature Physics on September 23, 2019.

Quantum to classical?

The superposition principle is a hallmark of quantum theory which emerges from one of the most fundamental equations of quantum mechanics, the Schrödinger equation. It describes particles in the framework of wave functions, which, much like water waves on the surface of a pond, can exhibit interference effects. But in contrast to water waves, which are a collective behavior of many interacting water molecules, quantum waves can also be associated with isolated single particles.
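In symbols (a standard textbook statement, included here for concreteness): if two wave functions each solve the Schrödinger equation, so does any linear combination of them, and the cross term in the probability density is what appears as interference:

```latex
i\hbar\,\frac{\partial\psi}{\partial t} = \hat{H}\psi,
\qquad
\psi = c_1\psi_1 + c_2\psi_2,
```
```latex
|\psi|^2 = |c_1|^2|\psi_1|^2 + |c_2|^2|\psi_2|^2
         + 2\,\operatorname{Re}\!\bigl(c_1^{*}c_2\,\psi_1^{*}\psi_2\bigr).
```

The last term is absent for classical particles; observing it for a 2000-atom molecule is what makes the Vienna result a record.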

Oct 1, 2019

Non-abelian Aharonov-Bohm experiment done at long last

Posted by in categories: computing, information science, particle physics, quantum physics

For the first time, physicists in the US have confirmed a decades-old theory regarding the breaking of time-reversal symmetry in gauge fields. Marin Soljacic at the Massachusetts Institute of Technology and an international team of researchers have made this first demonstration of the “non-Abelian Aharonov-Bohm effect” in two optics experiments. With improvements, their techniques could find use in optoelectronics and fault-tolerant quantum computers.

First emerging in Maxwell’s famous equations for classical electrodynamics, a gauge theory is a description of the physics of fields. Gauge theories have since become an important part of physicists’ descriptions of the dynamics of elementary particles – notably the theory of quantum electrodynamics.

A salient feature of a gauge theory is that the physics it describes does not change when certain transformations are made to the underlying equations describing the system. An example is the addition of a constant scalar potential or a “curl-free” vector potential to Maxwell’s equations. Mathematically, this does not change the electric and magnetic fields that act on a charged particle such as an electron – and therefore the behaviour of the electron – so Maxwell’s theory is gauge invariant.
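Written out for the electromagnetic case (standard electrodynamics, added for concreteness): the fields follow from the potentials, and shifting the potentials by any smooth gauge function χ leaves them untouched because the curl of a gradient vanishes:

```latex
\mathbf{B} = \nabla\times\mathbf{A},
\qquad
\mathbf{E} = -\nabla\phi - \frac{\partial\mathbf{A}}{\partial t},
```
```latex
\mathbf{A}\;\to\;\mathbf{A} + \nabla\chi,
\qquad
\phi\;\to\;\phi - \frac{\partial\chi}{\partial t},
\qquad
\nabla\times(\nabla\chi) = 0
\;\Rightarrow\;
\mathbf{B},\,\mathbf{E}\ \text{unchanged.}
```

Maxwell's theory is Abelian: successive gauge transformations commute. The non-Abelian case demonstrated by Soljačić's team is the one where they do not.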

Oct 1, 2019

Rapture of the nerds: will the Singularity turn us into gods or end the human race?

Posted by in categories: biotech/medical, finance, information science, mathematics, robotics/AI, singularity

Circa 2012


Hundreds of the world’s brightest minds — engineers from Google and IBM, hedge fund quants, and Defense Department contractors building artificial intelligence — were gathered in rapt attention inside the auditorium of the San Francisco Masonic Temple atop Nob Hill. It was the first day of the seventh annual Singularity Summit, and Julia Galef, the President of the Center for Applied Rationality, was speaking onstage. On the screen behind her, Galef projected a giant image from the film Blade Runner: the replicant Roy, naked, his face stained with blood, cradling a white dove in his arms.

At this point in the movie, Roy is reaching the end of his short, pre-programmed life. “The poignancy of his death scene comes from the contrast between that bitter truth and the fact that he still feels his life has meaning and, for lack of a better word, he has a soul,” said Galef. “To me this is the situation we as humans have found ourselves in over the last century. Turns out we are survival machines created by ancient replicators, DNA, to produce as many copies of them as possible. This is the bitter pill that science has offered us in response to our questions about where we came from and what it all means.”
