BLOG

Archive for the ‘information science’ category: Page 214

Jun 13, 2020

Israeli researchers explain how they are healing the world with precision

Posted by in categories: biotech/medical, computing, health, information science

Data governs our lives more than ever. But when it comes to disease and death, every data point is a person, someone who became sick and needed treatment.

Recent studies have revealed that people suffering from the same disease category may have different manifestations. As doctors and scientists better understand the reasons underlying this variability, they can develop novel preventive, diagnostic and therapeutic approaches and provide optimal, personalized care for every patient.

To accomplish this goal often requires broadscale collaborations between physicians, basic researchers, theoreticians, experimentalists, computational biologists, computer scientists and data scientists, engineers, statisticians, epidemiologists and others. They must work together to integrate scientific and medical knowledge, theory, analysis of medical big data and extensive experimental work.


Jun 12, 2020

Computer algorithms find tumors’ molecular weak spots

Posted by in categories: biotech/medical, computing, information science

Approach to identifying the best drug targets gets critical test.

Jun 11, 2020

Engineers offer smart, timely ideas for AI bottlenecks

Posted by in categories: information science, robotics/AI, transportation

Rice University researchers have demonstrated methods for both designing innovative data-centric computing hardware and co-designing hardware with machine-learning algorithms that together can improve energy efficiency by as much as two orders of magnitude.

Advances in machine learning, the form of artificial intelligence behind self-driving cars and many other high-tech applications, have ushered in a new era of computing—the data-centric era—and are forcing engineers to rethink aspects of computing architecture that have gone mostly unchallenged for 75 years.

“The problem is that for large-scale deep neural networks, which are state-of-the-art for machine learning today, more than 90% of the electricity needed to run the entire system is consumed in moving data between memory and the processor,” said Yingyan Lin, an assistant professor of electrical and computer engineering.
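The arithmetic behind that 90% figure shows why co-design matters. A minimal sketch, with illustrative reduction factors (only the 90% share comes from the quote above):

```python
# Back-of-envelope: if 90% of system energy goes to data movement,
# how much does reducing that movement improve overall efficiency?

def overall_gain(movement_share, movement_reduction, compute_reduction=1.0):
    """Whole-system energy-efficiency gain when data-movement energy is
    cut by `movement_reduction`x and compute energy by `compute_reduction`x
    (Amdahl-style accounting over the two energy components)."""
    compute_share = 1.0 - movement_share
    new_total = movement_share / movement_reduction + compute_share / compute_reduction
    return 1.0 / new_total

# Cutting only data movement 100x helps about 9x overall,
# because the untouched 10% compute share starts to dominate...
print(round(overall_gain(0.90, 100), 1))       # ~9.2
# ...so a two-orders-of-magnitude gain requires co-designed hardware
# and algorithms that shrink the compute energy as well.
print(round(overall_gain(0.90, 100, 100), 1))  # ~100.0
```

This is why the Rice approach pairs data-centric hardware with algorithm co-design rather than attacking either side alone.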

Jun 9, 2020

Rapid de novo assembly of the European eel genome from nanopore sequencing reads

Posted by in categories: biotech/medical, computing, information science

Circa 2017


We have sequenced the genome of the endangered European eel using the MinION by Oxford Nanopore, and assembled these data using a novel algorithm specifically designed for large eukaryotic genomes. For this 860 Mbp genome, the entire computational process takes two days on a single CPU. The resulting genome assembly significantly improves on a previous draft based on short reads only, both in terms of contiguity (N50 1.2 Mbp) and structural quality. This combination of affordable nanopore sequencing and lightweight assembly promises to make high-quality genomic resources accessible for many non-model plants and animals.
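The N50 contiguity metric cited above can be sketched in a few lines: it is the length L such that contigs of length at least L cover at least half of the total assembly. The contig lengths below are toy values, not real eel data:

```python
# Compute N50: sort contigs longest-first and walk down until the
# running total reaches half the assembly size.

def n50(contig_lengths):
    lengths = sorted(contig_lengths, reverse=True)
    half = sum(lengths) / 2.0
    running = 0
    for length in lengths:
        running += length
        if running >= half:
            return length

contigs = [1200, 900, 700, 400, 300, 100]  # total 3600, half 1800
print(n50(contigs))  # 900, since 1200 + 900 = 2100 >= 1800
```

A higher N50 means fewer, longer contigs, which is why the jump to 1.2 Mbp marks a real improvement over the short-read draft.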

Jun 7, 2020

Physicists create quantum-inspired optical sensor

Posted by in categories: biological, information science, quantum physics, space

Researchers from the Moscow Institute of Physics and Technology, joined by a colleague from Argonne National Laboratory, U.S., have implemented an advanced quantum algorithm for measuring physical quantities using simple optical tools. Published in Scientific Reports, their study takes us a step closer to affordable linear optics-based sensors with high performance characteristics. Such tools are sought after in diverse research fields, from astronomy to biology.

Maximizing the sensitivity of measurement tools is crucial for any field of science and technology. Astronomers seek to detect remote cosmic phenomena, biologists need to discern exceedingly tiny organic structures, and engineers have to measure the positions and velocities of objects, to name a few examples.

Until recently, no measurement could ensure precision beyond the so-called shot-noise limit, which stems from the statistical fluctuations inherent in classical observations. Quantum technology has provided a way around this, boosting precision to the fundamental Heisenberg limit, which follows from the basic principles of quantum mechanics. The LIGO experiment, which first detected gravitational waves in 2016, shows it is possible to achieve Heisenberg-limited sensitivity by combining complex optical interference schemes and quantum techniques.
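The gap between the two limits can be illustrated with their standard scaling laws: with N independent probes (e.g. photons), shot-noise-limited phase uncertainty scales as 1/√N, while the Heisenberg limit scales as 1/N. A schematic sketch, not drawn from the paper itself:

```python
import math

# Standard scaling of phase-measurement uncertainty with N probes.

def shot_noise_limit(n):
    """Classical (shot-noise) limit: uncertainty ~ 1/sqrt(N)."""
    return 1.0 / math.sqrt(n)

def heisenberg_limit(n):
    """Quantum (Heisenberg) limit: uncertainty ~ 1/N."""
    return 1.0 / n

for n in (100, 10_000):
    print(n, shot_noise_limit(n), heisenberg_limit(n))
# At N = 10,000 probes the Heisenberg limit is 100x tighter
# than the shot-noise limit.
```

The quadratic advantage is what makes quantum-inspired schemes attractive for faint signals, where simply adding more photons is impractical.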

Jun 7, 2020

Quantum Dots Shift Sunlight’s Spectrum to Speed Plant Growth

Posted by in categories: information science, quantum physics

In the consumer electronics industry, quantum dots are used to dramatically improve color reproduction in TV displays. That’s because LCD TV displays, the kind in most of our living rooms, require a backlight. This light typically comes from white or white-ish LEDs. The LCD filters the white light into red, green, and blue pixels; their combinations create the colors that appear on the screen.

Before quantum dots, filtering meant that much of the light didn’t make it to the screen. Putting a layer of quantum dots between the LEDs and the LCD, however, changes that equation. QD TVs use blue LEDs as the light source, then take advantage of the quantum effect to shift some of that light to tightly constrained red and green wavelengths. Because only this purified light reaches the filters—instead of the full spectrum that makes up white light—far less is blocked and wasted.

It turns out that this same approach to making your TV picture better can make plants grow faster, because plants, like LCD filters, are tuned to certain colors of light.

Jun 6, 2020

Locus Robotics raises another $40M as retailers increasingly look to automate

Posted by in categories: biotech/medical, business, information science, robotics/AI

The COVID-19 pandemic will have a profound impact on robotics, as more companies look to automation as a way forward. While wide-scale automation had long seemed like an inevitability, the pandemic is set to accelerate the push as corporations look for processes that remove the human element from the equation.

Of course, Locus Robotics hasn’t had too much of an issue raising money previously. The Massachusetts-based startup, which raised $26 million back in April of last year, is adding a $40 million Series D to its funds. That brings the full amount to north of $105 million. This latest round, led by Zebra Technologies, comes as the company looks to expand operations with the launch of a European HQ.

“The new funding allows Locus to accelerate expansion into global markets,” CEO Rick Faulk said in a release, “enabling us to strengthen our support of retail, industrial, healthcare, and 3PL businesses around the world as they navigate through the COVID-19 pandemic, ensuring that they come out stronger on the other side.”

Jun 2, 2020

Research finds some AI advances are over-hyped

Posted by in categories: information science, robotics/AI

Is it possible some instances of artificial intelligence are not as intelligent as we thought?

Call it artificial artificial intelligence.

A team of computer science graduate students reports that, on closer examination, several dozen information retrieval algorithms hailed as milestones in artificial intelligence research were nowhere near as revolutionary as claimed. The AI in those algorithms often amounted to minor tweaks of previously established routines.

Jun 2, 2020

AI System – Using Neural Networks With Deep Learning – Beats Stock Market in Simulation

Posted by in categories: finance, information science, robotics/AI

Researchers in Italy have applied convolutional neural networks (CNNs), a deep-learning technique within artificial intelligence, to achieve a system of market forecasting with the potential for greater gains and fewer losses than previous attempts to use AI methods to manage stock portfolios. The team, led by Prof. Silvio Barra at the University of Cagliari, published their findings in the IEEE/CAA Journal of Automatica Sinica.

The University of Cagliari-based team set out to create an AI-managed “buy and hold” (B&H) strategy: a system that each day chooses one of three possible actions: a long action (buying a stock and selling it before the market closes), a short action (selling a stock, then buying it back before the market closes), or a hold (deciding not to invest in that stock that day). At the heart of their proposed system is an automated cycle of analyzing layered images generated from current and past market data. Older B&H systems based their decisions on machine learning, a discipline that leans heavily on predictions based on past performance.

By letting their proposed network analyze current data layered over past data, they are taking market forecasting a step further, allowing for a type of learning that more closely mirrors the intuition of a seasoned investor rather than a robot. Their proposed network can adjust its buy/sell thresholds based on what is happening both in the present moment and the past. Taking into account present-day factors increases the yield over both random guessing and trading algorithms not capable of real-time learning.
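The adjustable buy/sell thresholds described above can be sketched as a simple decision rule. Assume a trained network emits a score in [-1, 1] for the day's layered market image; the function name and threshold values below are illustrative, not taken from the paper:

```python
# Map a model score to one of the three B&H actions via two thresholds.

def decide_action(score, buy_threshold=0.3, sell_threshold=-0.3):
    """Return 'long', 'short', or 'hold' for a model score in [-1, 1]."""
    if score >= buy_threshold:
        return "long"    # buy, then sell before the market closes
    if score <= sell_threshold:
        return "short"   # sell, then buy back before the market closes
    return "hold"        # stay out of the stock today

print(decide_action(0.55))   # long
print(decide_action(-0.41))  # short
print(decide_action(0.10))   # hold
```

In the paper's framing, the novelty lies in letting the network adjust these thresholds in response to present-day data rather than fixing them from historical performance alone.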

Jun 2, 2020

Carnegie Mellon tool automatically turns math into pictures

Posted by in categories: information science, mathematics

Some people look at an equation and see a bunch of numbers and symbols; others see beauty. Thanks to a new tool created at Carnegie Mellon University, anyone can now translate the abstractions of mathematics into beautiful and instructive illustrations.