
Stretchable display materials, which are gaining traction in the next-generation display market, can stretch and bend freely, but the limitations of existing materials have resulted in distorted screens and poor fit.

Typical elastomeric substrates are prone to screen distortion because of the “Poisson’s ratio” phenomenon, in which stretching in one direction causes the material to shrink in the perpendicular direction. In particular, electronics that sit in close contact with the skin, such as skin-attached wearable devices, are at risk of wrinkling or pulling on the skin as they stretch and shrink, resulting in poor fit and degraded performance.
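As a rough illustration of why this matters, the short Python sketch below applies the textbook linear-elastic relation, in which the transverse strain equals minus the Poisson’s ratio times the axial strain; the ratio values used here are illustrative placeholders, not measured properties of the KIST/SNU substrate.

```python
# Illustrative only: simple linear-elastic approximation with made-up nu values,
# not measured properties of the nanostructure-aligned substrate described above.

def transverse_strain(axial_strain: float, poissons_ratio: float) -> float:
    """Transverse strain = -nu * axial strain (linear-elastic approximation)."""
    return -poissons_ratio * axial_strain

stretch = 0.30  # stretch the substrate 30% along one axis

for label, nu in [("typical elastomer", 0.5), ("low-Poisson's-ratio substrate", 0.05)]:
    shrink = transverse_strain(stretch, nu)
    print(f"{label} (nu={nu}): strain in the perpendicular direction = {shrink:.1%}")

# typical elastomer (nu=0.5): strain in the perpendicular direction = -15.0%
# low-Poisson's-ratio substrate (nu=0.05): strain in the perpendicular direction = -1.5%
```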

A research team led by Dr. Jeong Gon Son of the Korea Institute of Science and Technology (KIST) and Professor Yongtaek Hong of Seoul National University has developed a nanostructure-aligned stretchable substrate that dramatically lowers the Poisson’s ratio. The work is published in the journal Advanced Materials.

This was first predicted by Omni magazine in 1981.


In the world of medicine, the ability to listen to the intricate symphony of sounds within the human body has long been a vital diagnostic tool. Physicians routinely employ stethoscopes to capture the subtle rhythms of air moving in and out of the lungs, the steady beat of the heart, and even the progress of digested food through the gastrointestinal tract.

These sounds hold valuable information about a person’s health, and any deviations from the norm can signal the presence of underlying medical issues. Now, a groundbreaking development from Northwestern University is set to transform the way we monitor these vital sounds.

In a breakthrough that could transform bioelectronic sensing, an interdisciplinary team of researchers at Rice University has developed a new method to dramatically enhance the sensitivity of enzymatic and microbial fuel cells using organic electrochemical transistors (OECTs). The research was recently published in the journal Device.

The innovative approach amplifies electrical signals by three orders of magnitude and improves signal-to-noise ratios, potentially enabling the next generation of highly sensitive, low-power biosensors for health and environmental monitoring.
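For context, “three orders of magnitude” is a factor of roughly 1,000. The small Python sketch below converts that factor to decibels; the decibel framing is an added illustration, not taken from the paper, and the correct formula depends on whether the reported gain refers to signal amplitude or power.

```python
import math

gain_factor = 1_000  # "three orders of magnitude", as reported

# Which conversion applies depends on whether the gain is an amplitude
# (voltage/current) gain or a power gain; both are shown.
amplitude_gain_db = 20 * math.log10(gain_factor)  # ~60 dB if an amplitude gain
power_gain_db = 10 * math.log10(gain_factor)      # ~30 dB if a power gain

print(f"{gain_factor}x as an amplitude gain: {amplitude_gain_db:.0f} dB")
print(f"{gain_factor}x as a power gain:      {power_gain_db:.0f} dB")
```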

“We have demonstrated a simple yet powerful technique to amplify weak bioelectronic signals using OECTs, overcoming previous challenges in integrating fuel cells with electrochemical sensors,” said corresponding author Rafael Verduzco, professor of chemical and biomolecular engineering and materials science and nanoengineering. “This method opens the door to more versatile and efficient biosensors that could be applied in medicine, environmental monitoring and even wearable technology.”

DGIST research teams have developed a self-powered sensor that uses motion and pressure to generate electricity and light simultaneously. This battery-free technology is expected to be used in various real-life applications, such as disaster rescue, sports, and wearable devices.

Triboelectric nanogenerators (TENGs) and mechanoluminescence (ML) have attracted attention as green energy technologies that can generate electricity and light, respectively, without external power. However, previous studies have mainly treated the two technologies separately or simply combined them. Moreover, the unstable power output of TENGs and the insufficient luminous duration of ML materials have been major limitations for practical applications.

The research team has developed a system that generates electricity and light simultaneously using motion and pressure. They added light-emitting zinc sulfide-copper (ZnS:Cu) particles to a rubber-like material (polydimethylsiloxane [PDMS]) and designed a single-electrode structure based on silver nanowires to obtain high efficiency. The device's performance does not degrade even after more than 5,000 repeated presses, and it stably generates voltages of up to 60 V and currents of up to 395 nA.
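A quick back-of-the-envelope estimate from those figures: if the 60 V and 395 nA peaks were assumed to occur at the same instant (which they generally do not in a triboelectric device), the instantaneous power would be capped at roughly 24 microwatts, as the short sketch below shows.

```python
# Upper-bound estimate only: TENG peak voltage and peak current generally do not
# coincide in time, so the true instantaneous power is lower than this product.
peak_voltage = 60.0     # volts, as reported
peak_current = 395e-9   # amperes (395 nA), as reported

upper_bound_power_w = peak_voltage * peak_current
print(f"Upper bound on instantaneous power: {upper_bound_power_w * 1e6:.1f} uW")
# -> Upper bound on instantaneous power: 23.7 uW
```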

Plastic that conducts electricity might sound impossible. But there is a special class of materials known as “electronic polymers” that combines the flexibility of plastic with the functionality of metal. This type of material opens the door for breakthroughs in wearable devices, printable electronics and advanced energy storage systems.

Yet, making thin films from electronic polymers has always been a difficult task. It takes a lot of fine-tuning to achieve the right balance of physical and electronic properties. Researchers at the U.S. Department of Energy’s (DOE) Argonne National Laboratory have created an innovative solution to this challenge with artificial intelligence (AI).

They used an AI-driven, automated materials laboratory, a tool called Polybot, to explore processing methods and produce high-quality films. Polybot is located at the Center for Nanoscale Materials, a DOE Office of Science user facility at Argonne.
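The core loop behind such a self-driving lab is conceptually simple: propose processing conditions, run the experiment, score the resulting film, and repeat. The sketch below shows that loop with a toy scoring function and a plain random search; the function names, parameter ranges, and optimum are invented for illustration and do not reflect Polybot’s actual software or workflow.

```python
import random

def coat_and_measure(coating_speed_mm_s: float, anneal_temp_c: float) -> float:
    """Stand-in for a robotic experiment: returns a hypothetical film-quality score."""
    # Toy response surface with an optimum near 5 mm/s and 150 C (invented values).
    return -(coating_speed_mm_s - 5.0) ** 2 - 0.01 * (anneal_temp_c - 150.0) ** 2

best_params, best_score = None, float("-inf")
for _ in range(50):  # random search stands in for the AI planner
    params = (random.uniform(1, 10), random.uniform(100, 250))
    score = coat_and_measure(*params)
    if score > best_score:
        best_params, best_score = params, score

print(f"Best conditions found: speed={best_params[0]:.1f} mm/s, "
      f"anneal={best_params[1]:.0f} C (score={best_score:.2f})")
```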



Wound infections are common combat injuries and can take otherwise able-bodied personnel out of operations and/or result in severe medical complications. The current standard of care relies on complicated and often time-consuming tests to identify the specific pathogens causing the infection. Therapeutic treatments rely on broad-spectrum, high-dose antibiotics alongside surgical excision – approaches that are not pathogen specific, drive antibiotic resistance, can have toxic side effects, require advanced medical training, and can result in high treatment costs and burden on patients. A game-changing approach to managing infection of combat wounds, particularly one that can be applied autonomously, would benefit warfighter readiness and resilience.

The BioElectronics to Sense and Treat (BEST) program seeks to meet this need by developing wearable, automated technologies that can predict and prevent a wound infection before it occurs, and eliminate an infection that has already taken hold. To achieve this, DARPA is seeking researchers to develop novel bioelectronic smart bandages comprising wound infection sensor and treatment modules. The sensors should be high-resolution and provide real-time, continual monitoring of wounds based on, for example, the person’s immune state and the community of bacteria that live in and around a wound. Data from these sensors will be used to predict whether a wound will fail to heal due to infection, diagnose the infection, and regulate administration of targeted treatments – using closed-loop control to prevent or resolve infection for improved wound healing.
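To make the closed-loop idea concrete, the sketch below walks through one sense-predict-treat iteration: read wound signals, score infection risk, and gate a targeted treatment. Every field, threshold, and scoring rule here is hypothetical, chosen only to illustrate the control structure, and is not part of the BEST program’s specification.

```python
from dataclasses import dataclass

@dataclass
class WoundReading:
    immune_marker: float   # hypothetical normalized inflammation signal (0-1)
    bacterial_load: float  # hypothetical normalized microbial signal (0-1)

def infection_risk(reading: WoundReading) -> float:
    """Toy risk score in [0, 1]; a real system would use a trained predictive model."""
    return min(1.0, 0.5 * reading.immune_marker + 0.5 * reading.bacterial_load)

def control_step(reading: WoundReading, threshold: float = 0.7) -> str:
    """One iteration of the sense -> predict -> treat loop."""
    if infection_risk(reading) >= threshold:
        return "deliver targeted treatment"
    return "continue monitoring"

print(control_step(WoundReading(immune_marker=0.9, bacterial_load=0.8)))  # treat
print(control_step(WoundReading(immune_marker=0.2, bacterial_load=0.1)))  # monitor
```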

“Given that infection initiates at the time of injury and can take hold before aid arrives, particularly in austere environments, the earlier we can deploy these technologies, the bigger impact they will have,” noted Dr. Leonard Tender, BEST program manager. “Even if medevac occurs immediately, without the ability to prevent infection, the downstream care required to treat the surge of wound infections resulting from a large-scale combat operation could easily overwhelm care capacity.”

Meta Platforms is assembling a specialized team within its Reality Labs division, led by Marc Whitten, to develop the AI, sensors, and software that could power the next wave of humanoid robots.

“We believe expanding our portfolio to invest in this field will only accrue value to Meta AI and our mixed and augmented reality programs,” Bosworth said.

How is Meta planning to advance its robotics work?

The effort falls under Meta’s CTO Andrew Bosworth, who leads Reality Labs; Bloomberg News reported the hiring first. Meta has also appointed John Koryl as vice president of retail. Koryl, the former CEO of second-hand e-commerce platform The RealReal, will focus on boosting direct sales of Meta’s Quest mixed reality headsets and AI wearables, including the Ray-Ban Meta smart glasses developed in partnership with EssilorLuxottica.

Meta’s initial play is to become the backbone of the industry, similar to what Google did with Android for smartphones. The company has already started talks with robotics firms like Unitree Robotics and Figure AI. With plans to hire 100 engineers this year and billions committed to AI and AR/VR, Meta is placing a major bet on humanoid robots as the next leap in smart home technology.


Researchers have achieved a breakthrough in wearable health technology by developing a novel self-healing electronic skin (E-Skin) that repairs itself in seconds after damage. This could potentially transform the landscape of personal health monitoring.

In a study published in Science Advances, scientists demonstrate an unprecedented advancement in E-Skin technology that recovers over 80% of its functionality within 10 seconds of being damaged—a dramatic improvement over existing technologies that can take minutes or hours to heal.

The technology seamlessly combines ultra-rapid self-healing, reliable performance, advanced artificial intelligence integration, and highly accurate health monitoring. This integration enables real-time fatigue detection and muscle strength assessment with remarkable precision.

More than 15 million people worldwide are living with spinal cord injury (SCI), which can affect their sensory and motor functions below the injury level. For individuals with SCI between C5 and C7 cervical levels, this can mean paralysis affecting their limbs and limited voluntary finger and wrist flexion, making it difficult to grasp large, heavy objects.

Now, a team of UC Berkeley engineers from the Embodied Dexterity Group has developed a wearable robotic device to enhance grasping functionality in this population. Dubbed the Dorsal Grasper, the device leverages voluntary wrist extension and uses supernumerary robotic fingers on the back of the hand to facilitate human-robot collaborative grasping.

In a study recently featured in IEEE Transactions on Neural Systems and Rehabilitation Engineering, the researchers demonstrated for the first time how the Dorsal Grasper can expand users’ graspable workspace. Test subjects found that they could easily grasp objects anywhere within their arm’s reach, without having to rotate their bodies, which can cause wheelchair users to lose their balance.