
Circa 2018


SPIDERS often make people jump, but a bunch of clever scientists have managed to train one to jump on demand.

Researchers managed to teach the spider – nicknamed Kim – to jump from different heights and distances so they could film the arachnid’s super-springy movements.

The study is part of a research programme at the University of Manchester, which aims to create a new class of micro-robots agile enough to jump like acrobatic spiders.

November 2019 is a landmark month in the history of the future. That’s when humanoid robots that are indistinguishable from people start running amok in Los Angeles. Well, at least they do in the seminal sci-fi film “Blade Runner.” Thirty-seven years after its release, we don’t have murderous androids running around. But we do have androids like Hanson Robotics’ Sophia, and they could soon start working in jobs traditionally performed by people.

Russian start-up Promobot recently unveiled what it calls the world’s first autonomous android. It closely resembles a real person and can serve in a business capacity. Robo-C can be made to look like anyone, so it’s like an android clone. It comes with an artificial intelligence system that has more than 100,000 speech modules, according to the company. It can operate at home, acting as a companion robot and reading out the news or managing smart appliances — basically, an anthropomorphic smart speaker. It can also perform workplace tasks such as answering customer questions in places like offices, airports, banks and museums, while accepting payments and performing other functions.

“We analyzed the needs of our customers, and there was a demand,” says Promobot co-founder and development director Oleg Kivokurtsev. “But, of course, we started the development of an anthropomorphic robot a long time ago, since in robotics there is the concept of the ‘Uncanny Valley,’ and the most positive perception of the robot arises when it looks like a person. Now we have more than 10 orders from companies and private clients from around the world.”

From the understated opulence of a Bentley to the stalwart family minivan to the utilitarian pickup, Americans know that the car you drive is an outward statement of personality. You are what you drive, as the saying goes, and researchers at Stanford have just taken that maxim to a new level.

Using computer algorithms that can see and learn, they have analyzed millions of publicly available images on Google Street View. The researchers say they can use that knowledge to determine the political leanings of a given neighborhood just by looking at the cars on the streets.

“Using easily obtainable visual data, we can learn so much about our communities, on par with some information that takes billions of dollars to obtain via census surveys. More importantly, this research opens up more possibilities of virtually continuous study of our society using sometimes cheaply available visual data,” said Fei-Fei Li, an associate professor of computer science at Stanford and director of the Stanford Artificial Intelligence Lab and the Stanford Vision Lab, where the work was done.
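The paper's actual pipeline isn't reproduced here, but the basic idea can be sketched in a few lines: tally the kinds of vehicles detected in a neighborhood's street imagery, then train a simple classifier on those counts. Everything below is purely illustrative; the vehicle categories, counts and labels are made-up placeholders, not data from the study.

```python
# Illustrative sketch only -- not the Stanford pipeline. The real work used
# learned vision models over millions of Street View images; this toy just
# shows the "car mix in, political leaning out" idea on placeholder numbers.
from sklearn.linear_model import LogisticRegression

# Hypothetical per-neighborhood feature vectors: counts of detected vehicle
# types from street imagery, e.g. [sedans, pickups, minivans, luxury cars].
neighborhood_car_counts = [
    [120, 15, 30, 25],   # placeholder neighborhood A
    [60, 90, 40, 5],     # placeholder neighborhood B
    [140, 10, 20, 40],   # placeholder neighborhood C
    [50, 110, 35, 8],    # placeholder neighborhood D
]
# Placeholder labels (0 / 1 for the two major parties); in the real study,
# ground truth came from publicly available precinct-level voting records.
leanings = [0, 1, 0, 1]

model = LogisticRegression().fit(neighborhood_car_counts, leanings)

# Predict the leaning of a new neighborhood from its (hypothetical) car mix.
print(model.predict([[80, 70, 30, 10]]))
```

The heavy lifting in the actual research is upstream of a step like this: recognizing cars at all, and at the level of specific makes and models, across millions of images.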

Fiat Chrysler Automobiles (FCA) rolled out the first of its plug-in hybrids for the U.S. on Thursday: a rechargeable Wrangler set to go on sale in America, Europe and China early next year.

The Wrangler 4xe can go 25 miles (40 kilometers) on electricity before a 2-liter turbocharged four-cylinder engine takes over. Drivers can choose to have an engine-powered generator recharge the batteries (at a higher fuel consumption rate), although it would take about 2.5 hours at 45 to 55 mph (72.4 to 88.5 kilometers per hour) to fully replenish them.
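For a rough sense of what that engine-assisted recharge implies, here is a back-of-envelope sketch. The article doesn't give the 4xe's battery capacity, so the figure below is an assumed round number, not a published spec.

```python
# Back-of-envelope only: battery capacity is an assumption, not from the article.
assumed_battery_kwh = 17.0      # assumed round figure for the pack size
recharge_hours = 2.5            # from the article: ~2.5 h at 45-55 mph

# Average net charging power the engine-driven generator would need to supply.
avg_charge_power_kw = assumed_battery_kwh / recharge_hours
print(f"~{avg_charge_power_kw:.1f} kW of net charging power")  # ~6.8 kW
```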

A big driver of the new offerings is FCA’s obligation to meet fuel economy and pollution regulations in Europe, China, and the U.S. or face stiff fines or steep costs to buy electric vehicle credits from companies like Tesla.

A pioneer in Emotion AI, Rana el Kaliouby, Ph.D., is on a mission to humanize technology before it dehumanizes us.

At LiveWorx 2020, Rana joined us to share insights from years of research and collaboration with MIT’s Advanced Vehicle Technology group.

In a session that is part demo and part presentation, Rana breaks down the facial patterns cameras can pick up from a tired or rested driver and shares observations from the first-ever large-scale study of driver behavior over time.
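The session itself isn't reproduced here, but one widely used drowsiness cue in the computer-vision literature, the eye aspect ratio computed from facial landmarks, gives a feel for the kind of signal an in-cabin camera can extract. This is a generic sketch, not Affectiva's method, and the landmark coordinates are made-up values.

```python
# Generic drowsiness cue, not Affectiva's pipeline: the "eye aspect ratio"
# (EAR) computed from six landmark points around one eye. A low EAR sustained
# over many frames suggests the eye is closed.
import math

def eye_aspect_ratio(eye):
    """eye: six (x, y) landmark points ordered around the eye contour,
    as produced by common 68-point facial landmark detectors."""
    vertical_1 = math.dist(eye[1], eye[5])
    vertical_2 = math.dist(eye[2], eye[4])
    horizontal = math.dist(eye[0], eye[3])
    return (vertical_1 + vertical_2) / (2.0 * horizontal)

# Hypothetical landmarks for an open and a nearly closed eye (made-up points).
open_eye = [(0, 2), (2, 4), (4, 4), (6, 2), (4, 0), (2, 0)]
closed_eye = [(0, 2), (2, 2.4), (4, 2.4), (6, 2), (4, 1.6), (2, 1.6)]
print(eye_aspect_ratio(open_eye), eye_aspect_ratio(closed_eye))  # ~0.67 vs ~0.13
```

In practice, a signal like this would be tracked over time rather than read from a single frame, and combined with other cues such as head pose and blink rate.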

Learn how these inferences can be used to change the driving experience ➡️ https://archive.liveworx.com/sessions/artificial-emotional-i…it-matters


Today’s devices work hand-in-hand with humans – at work, home, school and play. Dr. Rana el Kaliouby believes they can do much more. An expert in artificial emotional intelligence, or “Emotion AI,” Dr. el Kaliouby explores the valuable applications of humanized technology in media and advertising, gaming, automotive, robotics, health, education and more. She explains how machine learning works, explores the potential for the development of emotion chips, and addresses the ethics and privacy issues of Emotion AI. In her talks, Dr. el Kaliouby gives participants an inside look at the world’s largest emotion data repository: results from her research studying more than 5 million faces around the world. She also reveals that the emoji mindset may soon be a thing of the past.