Wearable technology is capable of tracking various measures of human health and is getting better all the time. New research shows how it could one day offer real-time feedback on posture and body mechanics. A research team at Cornell University has demonstrated this functionality in a novel wrist-mounted camera system, which it hopes to work into the smartwatches of the future.
The system is dubbed BodyTrak and comes from the same lab behind a face-tracking wearable we looked at earlier in the year, which uses sonar to recreate facial expressions on a digital avatar. This time around, the group paired a dime-sized RGB camera with a customized AI to construct models of the entire body.
The camera is worn on the wrist and relays basic images of body parts in motion to a deep neural network trained to turn these snippets into virtual recreations of the body. Working in real time, the system fills in the blanks left by the camera's partial views to construct 3D models of the body in 14 different poses.
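At a high level, the pipeline described above is a regression problem: a partial wrist-camera frame goes in, and coordinates for a full-body model come out. The sketch below illustrates that shape only; the frame size, the 14-pose output layout, and the linear "network" standing in for the trained model are all illustrative assumptions, not the Cornell team's actual architecture.

```python
import random

N_POSES = 14            # the article's 14 body poses
FRAME_PIXELS = 32 * 32  # assumed tiny grayscale frame, flattened

random.seed(0)
# Stand-in for trained weights: one row per output value
# (an x, y, z coordinate for each of the 14 poses).
weights = [[random.uniform(-1, 1) for _ in range(FRAME_PIXELS)]
           for _ in range(N_POSES * 3)]

def estimate_pose(frame):
    """Map a flattened frame (pixel values 0-255) to 14 (x, y, z) points."""
    x = [p / 255.0 for p in frame]                                # normalize
    out = [sum(w * v for w, v in zip(row, x)) for row in weights]  # "network"
    # Group the flat output into (x, y, z) triples, one per pose.
    return [tuple(out[i * 3:i * 3 + 3]) for i in range(N_POSES)]

frame = [random.randrange(256) for _ in range(FRAME_PIXELS)]
pose = estimate_pose(frame)
print(len(pose), len(pose[0]))  # 14 3
```

In the real system, a trained deep network replaces the random linear map, letting it infer body parts the camera never directly sees.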