
It gives new meaning to the phrase “speak your mind.”

Do you remember how legendary cosmologist Stephen Hawking communicated using his special screen-equipped chair? His system actually read tiny cheek-muscle movements rather than brain signals, but it pointed the way toward the brain-computer interface (BCI), a device that allows a person to communicate using their brain activity alone.

There are approximately 70 million people across the globe who suffer from speech-related disorders. What if there were a BCI for each of these patients that could at least spell out words, if not speak for them? A team of researchers from the University of California, San Francisco (UCSF) is working on one such groundbreaking device.

They have created a neuroprosthesis (a type of BCI device that re-establishes lost functions of the nervous system) that analyzes the brain activity of a user with speech paralysis. The device then translates the brain signals into single letters and spells out sentences on a screen, letting anyone read what the user wants to say.
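The decoding step can be pictured as a classifier that maps each window of neural features to the most likely letter. Here is a minimal sketch using toy nearest-centroid "templates" in place of the deep networks real systems train on brain recordings; every name and number below is hypothetical, not UCSF's actual method:

```python
import numpy as np

# Hypothetical letter "templates": one mean neural-feature vector per letter,
# standing in for a trained classifier. This is only a toy illustration.
rng = np.random.default_rng(0)
ALPHABET = "abcdefghijklmnopqrstuvwxyz"
templates = {ch: rng.normal(size=16) for ch in ALPHABET}

def decode_letter(features: np.ndarray) -> str:
    """Nearest-centroid decoding: pick the letter whose template
    is closest to the observed feature vector."""
    return min(templates, key=lambda ch: np.linalg.norm(features - templates[ch]))

def spell(feature_windows) -> str:
    """Decode a sequence of feature windows into a spelled-out string."""
    return "".join(decode_letter(w) for w in feature_windows)

# Simulate a user attempting to spell "hi": noisy copies of the templates.
attempt = [templates[ch] + 0.1 * rng.normal(size=16) for ch in "hi"]
print(spell(attempt))  # with this low noise level, decodes back to "hi"
```

In practice the decoder also leans on a language model to correct unlikely letter sequences, which is what makes whole-sentence spelling usable.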

More affordable than the regular ones.

The Arm2u biomedical engineering team from the Barcelona School of Industrial Engineering (ETSEIB) of the Universitat Politècnica de Catalunya designed and constructed a configurable transradial prosthesis that responds to the user’s nerve impulses using 3D printing technology.

Arm2u is a prosthesis that can replace a missing arm below the elbow. It can be controlled with myoelectric control, which means that it is controlled by the natural electrical signals produced by muscle contraction.
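Myoelectric control can be illustrated as a small signal-processing pipeline: rectify and smooth the raw muscle (EMG) signal into an activation envelope, then threshold it to drive the hand. The sketch below uses made-up signal values and a hypothetical threshold; real prostheses like Arm2u use calibrated, multi-channel decoding:

```python
import math

def emg_envelope(samples, alpha=0.1):
    """Rectify and low-pass filter raw EMG samples into a smooth
    activation envelope (exponential moving average of |signal|)."""
    env, out = 0.0, []
    for s in samples:
        env = (1 - alpha) * env + alpha * abs(s)
        out.append(env)
    return out

def gripper_command(envelope, threshold=0.5):
    """Map the latest envelope value to a simple open/close command."""
    return "close" if envelope[-1] > threshold else "open"

# Simulated recording: quiet muscle, then a strong contraction.
quiet = [0.05 * math.sin(0.3 * i) for i in range(50)]
contraction = [1.0 * math.sin(2.0 * i) for i in range(50)]
signal = quiet + contraction

env = emg_envelope(signal)
print(gripper_command(env))  # "close" once the contraction is detected
```

The smoothing step matters: raw EMG crosses zero constantly, so thresholding the unfiltered signal would make the hand chatter open and closed.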



As stated in the release, UPC bachelor’s and master’s degree students started off by improving a prosthesis for people with disabilities using assistive technologies.

Biomedical and electrical engineers at UNSW Sydney have developed a new way to measure neural activity using light—rather than electricity—which could lead to a complete reimagining of medical technologies like nerve-operated prosthetics and brain-machine interfaces.

Professor François Ladouceur, with UNSW’s School of Electrical Engineering and Telecommunications, says the multi-disciplinary team has just demonstrated in the lab what it proved theoretically shortly before the pandemic: that sensors built using liquid crystal and integrated optics technologies—dubbed “optrodes”—can register nerve impulses in a living animal body.

Not only do these optrodes perform just as well as conventional electrodes—that use electricity to detect a nerve impulse—but they also address “very thorny issues that competing technologies cannot address,” says Prof. Ladouceur.

He turned a crisis into an opportunity.

Brian Stanley is a living human cyborg. He went viral after sharing a video on social media of a prosthetic-eye flashlight that can light up a whole room.

As Brian Stanley suggested in the video, the eye has a battery life of roughly 20 hours, and “it does not get hot.”



After losing one eye to cancer, Stanley, a Southern California-based engineer and prototype machinist, decided to install a flashlight into his eye socket, dubbing it the “Titanium Skull Lamp.”

A titanium robotic exoskeleton is helping an eight-year-old boy in Mexico learn to walk after he spent most of his life in a wheelchair.

The boy, David, suffers from cerebral palsy, a group of neurological disorders that surfaces during early childhood and hinders a child’s ability to control their muscle movements. In effect, it makes it extremely difficult for an affected child to walk and maintain their balance and posture.

As you can imagine, rehabilitating a child with cerebral palsy is a long and arduous process. But now, David is speeding up his rehabilitation with the help of the battery-powered Atlas 2030 exoskeleton, developed by award-winning Spanish roboticist Elena García Armada.

“This exoskeleton personalizes assistance as people walk normally through the real world,” said Steve Collins, associate professor of mechanical engineering who leads the Stanford Biomechatronics Laboratory, in a press release. “And it resulted in exceptional improvements in walking speed and energy economy.”

The personalization is enabled by a machine learning algorithm, which the team trained using emulators—that is, machines that collected data on motion and energy expenditure from volunteers who were hooked up to them. The volunteers walked at varying speeds under imagined scenarios, like trying to catch a bus or taking a stroll through a park.

The algorithm drew connections between these scenarios and people’s energy expenditure, applying the connections to learn in real time how to help wearers walk in a way that’s actually useful to them. When a new person puts on the boot, the algorithm tests a different pattern of assistance each time they walk, measuring how their movements change in response. There’s a short learning curve, but on average the algorithm was able to effectively tailor itself to new users in just an hour.
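That trial-and-adapt loop resembles human-in-the-loop optimization: propose an assistance pattern, measure the wearer’s energy cost, and keep whatever measures cheaper. A minimal sketch, assuming a single assistance gain and a quadratic stand-in for measured metabolic cost; both are hypothetical, and the Stanford system actually optimizes full torque curves with a more sample-efficient algorithm:

```python
import random

def metabolic_cost(torque_gain: float, user_optimum: float = 0.7) -> float:
    """Stand-in for measured energy expenditure: lowest when the
    assistance gain matches this (made-up) user's optimum, plus a
    little measurement noise."""
    return (torque_gain - user_optimum) ** 2 + random.gauss(0, 0.001)

def tune_assistance(trials: int = 60) -> float:
    """Toy human-in-the-loop tuning: try perturbed assistance gains
    on successive walking bouts, keep whichever measures cheaper."""
    best_gain = 0.2                      # initial, generic assistance
    best_cost = metabolic_cost(best_gain)
    step = 0.2
    for _ in range(trials):
        candidate = best_gain + random.uniform(-step, step)
        cost = metabolic_cost(candidate)
        if cost < best_cost:             # cheaper walking -> adopt it
            best_gain, best_cost = candidate, cost
        step *= 0.97                     # narrow the search over time
    return best_gain

random.seed(1)
print(round(tune_assistance(), 2))  # converges near the user's optimum
```

The measurement noise in the cost function is the crux of the real problem: metabolic estimates from a walking human are noisy, which is why sample efficiency, not raw search, drives the one-hour adaptation time.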

Inspired by insects, robotic engineers are creating machines that could aid in search and rescue, pollinate plants and sniff out gas leaks.

Cyborg cockroaches that find earthquake survivors. A “robofly” that sniffs out gas leaks. Flying lightning bugs that pollinate farms in space.

These aren’t just buzzy ideas; they’re becoming reality.

Robotic engineers are scouring the insect world for inspiration. Some are strapping 3D-printed sensors onto live Madagascar hissing cockroaches, while others are creating fully robotic bugs inspired by the ways insects move and fly.

