What if you could have a virtual doctor’s appointment from your living room—and they could feel the bump on your knee from thousands of miles away? With new research at the FAMU-FSU College of Engineering and Northwestern University, this might happen sooner than you think.
Researchers at the joint college are taking robotic touch to a new level using shape-based remote technology. Haptic technology uses robotics to create a sense of touch through vibrations, motion or other forces. It’s the ability to sense remote shapes that is novel in this research.
“We want to do for touch what the cell phone did for voice,” Carl Moore, associate professor in mechanical engineering, said. “When you talk on a cell phone, a person can be a world away, yet you can hear them as if they are next door. However, you can’t physically interact with the person when that person is not with you—but what if you could?”
Moore and Rodney Roberts, a professor of electrical and computer engineering at FAMU-FSU College of Engineering, are working with researchers J. Edward Colgate and Mathew Elwin at Northwestern University’s Center for Robotics and Biosystems in a new joint study funded by the National Science Foundation (NSF) National Robotics Initiative.
The team has designed a glove that lets its wearer feel objects miles away. The Shape-Based Remote Manipulation (SBRM) glove is a novel multi-fingered haptic device that uses mathematical models to replicate objects grasped by a remote robot called an avatar, allowing the user to feel those remote objects virtually. To further the research, the team is creating a testbed linking their physical research centers in Florida and Illinois.
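As a rough illustration of what a shape-based object description might look like (a hedged sketch; the project's actual models are not spelled out in this article), the remote site could send a compact geometric model of a grasped object rather than a stream of raw force readings, and the glove would render touch against that model. The Python below assumes a simple sphere described by a signed-distance function; the class, field names and numbers are invented for illustration.

    # A minimal sketch of a "shape-based" object description: instead of
    # streaming raw contact forces, the remote site sends a compact model --
    # here, a sphere defined by its center and radius -- and the local glove
    # renders touch against it. (Illustrative only, not the team's model.)
    from dataclasses import dataclass
    import math

    @dataclass
    class SphereShape:
        center: tuple          # (x, y, z) position of the remote object, in meters
        radius: float          # meters

        def signed_distance(self, p):
            """Distance from fingertip p to the surface; negative means inside."""
            dx, dy, dz = (p[i] - self.center[i] for i in range(3))
            return math.sqrt(dx * dx + dy * dy + dz * dz) - self.radius

    # Example: a cup-sized object half a meter in front of the avatar.
    cup = SphereShape(center=(0.0, 0.5, 0.0), radius=0.04)
    print(cup.signed_distance((0.0, 0.47, 0.0)))  # about -0.01: fingertip 1 cm inside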
Physically, the haptic glove and its attached robotic arm are housed in the Center for Intelligent Systems, Control and Robotics (CISCOR) at the FAMU-FSU College of Engineering campus in Tallahassee, Florida. This system interacts with an avatar robot at the Center for Robotics and Biosystems at Northwestern University in Evanston, Illinois.
“Through an internet connection, a student, researcher or faculty member can put on a VR headset to see what the avatar sees,” Moore said. “Using the glove and robotic arm, if the person reaches out, the avatar reaches out. The user controls the avatar to do tasks, like pick up an object. If he picks up a cup, the robotic arm will simulate the cup’s weight while the glove simulates its shape.”
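As a concrete (and entirely hypothetical) sketch of the two-way exchange Moore describes, the local site could send the operator's hand pose each cycle, and the remote site could answer with the avatar's state plus the mass and shape of anything it is grasping, so the arm can render weight and the glove can render shape. The message format, field names and numbers below are invented for illustration; the project's actual protocol is not described in this article.

    # Hypothetical message formats for the glove-to-avatar link (illustrative only).
    import json

    def local_to_remote(hand_pose, grip_closure):
        """Package the operator's motion for the avatar."""
        return json.dumps({"pose": hand_pose, "grip": grip_closure})

    def remote_to_local(avatar_pose, grasped_object=None):
        """Package the avatar's state; include object mass/shape when grasping."""
        msg = {"avatar_pose": avatar_pose}
        if grasped_object is not None:
            msg["object"] = grasped_object   # arm renders mass_kg * 9.81 N; glove renders shape
        return json.dumps(msg)

    # One cycle: the user reaches for a cup; the avatar mirrors the motion and
    # reports the cup's mass (for the arm) and shape (for the glove).
    uplink = local_to_remote(hand_pose=[0.0, 0.5, 0.1], grip_closure=0.8)
    downlink = remote_to_local(avatar_pose=[0.0, 0.5, 0.1],
                               grasped_object={"mass_kg": 0.3,
                                               "shape": {"type": "cylinder",
                                                         "radius": 0.04,
                                                         "height": 0.10}})
    print(uplink)
    print(downlink)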
In the future, Moore envisions an avatar helper robot in any scenario where, across a distance, virtual touch could enhance a diagnosis or assessment. For instance, current telemedicine options could be augmented by adding haptics to the video teleconference setup. Through a device similar to Moore’s glove, a doctor could add touch to their virtual diagnostic toolkit, feeling a patient’s knee as if they were in the same room.
“Many people want to age in their homes, but they can’t because they need help,” Moore said. “We can envision a time when this will be an affordable technology, just like other technology we have in our homes to help people.”
Moore is working on the mechanical design of the robotics system, and Roberts is doing the complicated math needed for system control.
“Dr. Roberts is a combination mathematician and roboticist,” Moore said. “He is developing equations that represent surfaces and is an expert in the math we are using.”
One of the issues that plague haptic telecommunication is lag time. Like the buffering during streaming video, a user might see the avatar contact an object before having a sense of touching it. However, the research team has developed a fix.
“Generating the shapes and surfaces locally means we can eclipse lag time and give the user a better experience,” Moore said. “Some haptic devices calculate force to give a sense of touch to the user. It is much more difficult to prevent buffering instabilities with that method. Using shapes in haptic technology is novel and solves buffering issues that occur with other methods.”
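Moore’s point can be illustrated with a small sketch: because the shape model is stored locally, the fast force loop never waits on a round trip between Tallahassee and Evanston; the network only refreshes the model now and then. The stiffness value, sphere geometry and update rate below are illustrative assumptions, not the team’s implementation.

    # Why local shape rendering sidesteps lag: the ~1 kHz force loop reads only
    # a locally stored shape model, so network round-trip delay never sits
    # inside the force feedback loop. (Values are illustrative assumptions.)
    import math

    SHAPE = {"center": (0.0, 0.5, 0.0), "radius": 0.04}  # refreshed over the network
    STIFFNESS = 800.0  # N/m, a typical order of magnitude for haptic rendering

    def contact_force(fingertip):
        """Penalty force pushing the fingertip back out of the stored shape."""
        cx, cy, cz = SHAPE["center"]
        dx, dy, dz = fingertip[0] - cx, fingertip[1] - cy, fingertip[2] - cz
        dist = math.sqrt(dx * dx + dy * dy + dz * dz)
        penetration = SHAPE["radius"] - dist
        if dist == 0.0 or penetration <= 0.0:
            return (0.0, 0.0, 0.0)       # fingertip is outside the object
        scale = STIFFNESS * penetration / dist
        return (dx * scale, dy * scale, dz * scale)

    # Each 1 ms tick the glove can be commanded immediately -- no waiting for a
    # force sample to travel Tallahassee -> Evanston -> Tallahassee.
    print(contact_force((0.0, 0.47, 0.0)))  # fingertip 1 cm inside -> about 8 N outward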
The investigators are also making an educational impact with the project by recruiting and training a majority-BIPOC (Black, Indigenous, People of Color) group of graduate students for the research. When they graduate, these students could account for more than ten percent of the Black doctoral degrees in mechanical engineering conferred nationwide that year.
The team has an outreach project associated with the study to get more young people involved in STEM. They are working on a tele-touch exhibit with the Challenger Learning Center in Tallahassee, Florida, and the Museum of Science and Industry in Chicago, Illinois. The idea is still in its early stages but might allow students to interact by shaking hands from a distance or jointly building a Lego project.
“Since this is a new technology, how will students know they are interested in it if they have never seen it before?” Moore said. “We want students to be engaged, and some of them might get inspired to study this in the future or work with these types of robots.”
The $1.2 million research project is funded by two four-year grants from the NSF National Robotics Initiative, through Florida A&M University and Northwestern University.