
Year 2021


Communication between brain activity and computers, known as brain-computer interface or BCI, has been used in clinical trials to monitor epilepsy and other brain disorders. BCI has also shown promise as a technology to enable a user to move a prosthesis simply by neural commands. Tapping into the basic BCI concept would make smartphones smarter than ever.

Research has zeroed in on retrofitting wireless earbuds to detect neural signals. The data would then be transmitted to a smartphone via Bluetooth. Software at the smartphone end would translate different brain wave patterns into commands. The emerging technology is called Ear EEG.
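To make the phone-side translation step concrete, here is a minimal Python sketch of how streamed earbud EEG could be mapped to commands by comparing band power. The sampling rate, frequency bands, threshold, and command names are illustrative assumptions, not details from the Berkeley work.

```python
# Hypothetical Ear EEG pipeline: an earbud streams EEG samples over Bluetooth,
# and phone-side software turns brain-wave band power into simple commands.
# All parameters below are assumptions for illustration only.
import numpy as np
from scipy.signal import welch

FS = 250  # assumed sampling rate in Hz

def band_power(samples, fs, low, high):
    """Average spectral power of `samples` between `low` and `high` Hz."""
    freqs, psd = welch(samples, fs=fs, nperseg=min(len(samples), 256))
    mask = (freqs >= low) & (freqs <= high)
    return psd[mask].mean()

def classify(samples, fs=FS):
    """Toy rule: alpha power (8-12 Hz) well above beta power (13-30 Hz)
    is treated as a 'relax' command, anything else as 'focus'."""
    alpha = band_power(samples, fs, 8, 12)
    beta = band_power(samples, fs, 13, 30)
    return "RELAX_COMMAND" if alpha > 1.5 * beta else "FOCUS_COMMAND"

if __name__ == "__main__":
    # Stand-in for one second of streamed earbud data (white noise here).
    window = np.random.randn(FS)
    print(classify(window))
```

A real system would add artifact rejection and per-user calibration, but the basic flow, window the signal, extract band features, map features to a command, is the same.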

Rikky Muller, Assistant Professor of Electrical Engineering and Computer Science, has refined the physical comfort of EEG earbuds and has demonstrated their ability to detect and record brain activity. With support from the Bakar Fellowship Program, she is building out several applications to establish Ear EEG as a new platform technology to support consumer and health monitoring apps.

A new study by Brown University researchers suggests that gold nanoparticles—microscopic bits of gold thousands of times thinner than a human hair—might one day be used to help restore vision in people with macular degeneration and other retinal disorders.

In a study published in the journal ACS Nano, the research team showed that nanoparticles injected into the retina can successfully stimulate the visual system and restore vision in mice with retinal disorders. The findings suggest that a new type of visual prosthesis system in which nanoparticles, used in combination with a small laser device worn in a pair of glasses or goggles, might one day help people with retinal disorders to see again.

“This is a new type of retinal prosthesis that has the potential to restore vision lost to retinal degeneration without requiring any kind of complicated surgery or genetic modification,” said Jiarui Nie, a postdoctoral researcher at the National Institutes of Health who led the research while completing her Ph.D. at Brown. “We believe this technique could potentially transform treatment paradigms for retinal degenerative conditions.”

The skin is the largest organ in the human body. It makes up around 15 percent of our body weight and protects us from pathogens, dehydration and temperature extremes. Skin diseases are therefore more than just unpleasant – they can quickly become dangerous for affected patients. Although conditions such as skin cancer, chronic wounds and autoimmune skin diseases are widespread, we often still don’t fully understand why they develop and how we can treat them effectively.

To find answers to these questions, Empa researchers are working together with clinical partners on a model of human skin. The model will allow scientists to simulate skin diseases and thus better understand them. This is not a computer or plastic model. Rather, researchers from Empa’s Laboratory for Biomimetic Membranes and Textiles and its Laboratory for Biointerfaces aim to produce a living “artificial skin” that contains cells and emulates the layered and wrinkled structure of human skin. The project is part of the Swiss research initiative SKINTEGRITY.CH.

In order to recreate something as complex as skin, suitable building materials are needed. This is where Empa researchers have recently made progress: They have developed a hydrogel that meets the complex requirements while being easy to manufacture. The basis: gelatin from the skin of cold-water fish.

Scientists Just Merged Human Brain Cells With AI – Here’s What Happened!
What happens when human brain cells merge with artificial intelligence? Scientists have just achieved something straight out of science fiction—combining living neurons with AI to create a hybrid intelligence system. The results are mind-blowing, and they could redefine the future of computing. But how does it work, and what does this mean for humanity?

In a groundbreaking experiment, researchers successfully integrated human brain cells with AI, creating a system that learns faster and more efficiently than traditional silicon-based computers. These “biocomputers” use lab-grown brain organoids to process information, mimicking human thought patterns while leveraging AI’s speed and scalability. The implications? Smarter, more adaptive machines that think like us.

Why is this such a big deal? Unlike conventional AI, which relies on brute-force data crunching, this hybrid system operates more like a biological brain—learning with less energy, recognizing patterns intuitively, and even showing early signs of creativity. Potential applications include ultra-fast medical diagnostics, self-improving robots, and brain-controlled prosthetics that feel truly natural.

But with great power comes big questions. Could this lead to conscious machines? Will AI eventually surpass human intelligence? And what are the ethical risks of blending biology with technology? This video breaks down the science, the possibilities, and the controversies—watch to the end for the full story.

How did scientists merge brain cells with AI? What are biocomputers? Can AI become human-like? What is hybrid intelligence? Will AI replace human brains? This video will answer all these questions. Make sure you watch all the way through so you don’t miss anything.


How does a robotic arm or a prosthetic hand learn a complex task like grasping and rotating a ball? The challenge for the human, prosthetic or robotic hand has always been to correctly learn to control the fingers to exert forces on an object.

The tactile receptors and nerve endings that cover our hands are credited with helping us learn and adapt our manipulation, so roboticists have insisted on incorporating tactile sensors into robotic hands. But – given that you can still learn to handle objects with gloves on – there must be something else at play.

This mystery is what inspired researchers in the ValeroLab in the Viterbi School of Engineering to explore whether tactile sensation is really always necessary for learning to control the fingers.
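One way to picture learning without touch is a controller that only ever sees how well the overall task went. The Python sketch below is not the ValeroLab method; it is a toy trial-and-error learner where the target forces, the error function, and every parameter are made up for illustration, and no per-finger tactile signal is used.

```python
# Illustrative only: learn finger force commands for a simulated grasp using
# nothing but a scalar task-outcome score as feedback (no tactile sensing).
import numpy as np

rng = np.random.default_rng(0)
TARGET = np.array([2.0, 1.5, 1.0])   # assumed "ideal" forces for three fingers

def task_error(forces):
    """Stand-in for the task outcome (e.g., how badly the ball slips or
    over-rotates). The learner only sees this number, never touch signals."""
    return float(np.sum((forces - TARGET) ** 2))

def learn(steps=200, step_size=0.2):
    forces = np.zeros(3)              # start with no grip at all
    best = task_error(forces)
    for _ in range(steps):
        trial = forces + rng.normal(scale=step_size, size=3)  # explore by trial and error
        err = task_error(trial)
        if err < best:                # keep changes that improve the outcome
            forces, best = trial, err
    return forces, best

if __name__ == "__main__":
    forces, err = learn()
    print("learned finger forces:", np.round(forces, 2), "task error:", round(err, 4))
```

The point of the toy example is simply that task-level feedback alone can be enough to shape finger forces, which is the question the ValeroLab work probes in a far richer setting.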

Naturalistic communication is an aim for neuroprostheses. Here the authors present a neuroprosthesis that restores the voice of a paralyzed person simultaneously with their speaking attempts, enabling naturalistic communication.

Imagine navigating a virtual reality with contact lenses or operating your smartphone underwater: This and more could soon be a reality thanks to innovative e-skins.

A research team led by the Helmholtz-Zentrum Dresden-Rossendorf (HZDR) has developed an electronic skin that detects and precisely tracks magnetic fields with a single global sensor. This artificial skin is not only light, transparent and permeable, but also mimics the interactions of real skin and the brain, as the team reports in the journal Nature Communications.

Originally developed for robotics, e-skins imitate the properties of real skin. They can give robots a sense of touch or replace lost senses in humans. Some can even detect chemical substances or magnetic fields. But the technology also has its limits. Highly functional e-skins are often impractical because they rely on extensive electronics and large batteries.


Can you implant lab-grown brain tissue to heal brain damage? Kind of. What if you also implant an electrical stimulation device? The next generation of brain implants may be the Organoid Brain-Computer Interface (OBCI).

Learn about: brain organoids, dendritic spines, synapses, presynaptic and postsynaptic neurons, neurotransmitters.

Story of Einstein’s Brain: https://www.npr.org/2005/04/18/4602913/the-long-strange-jour…eins-brain