While looking for new research on robotics last week, I caught the news from Facebook — I mean, Meta — about its new flexible sensor. It unveiled what it calls ReSkin technology for tactile sensing, created in conjunction with Carnegie Mellon University researchers. I think this technology could have a lot of potential in the smart home.
ReSkin is essentially a low-cost ($6 per 100) flexible sensor meant to mimic the sense of touch from human skin. Because it can measure tactile pressure, non-human hands such as those of robots can pick up fragile objects without crushing them. Here’s a video containing several ReSkin demonstrations, including one of a robotic gripper picking up grapes without turning them into pulpy juice.
This isn’t the first time we’ve seen tactile technology. Earlier this year, the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL) showed off similar elastic sensors for robotic arms. Where ReSkin differs, however, is in how it gets data from its sensors.
Movement can cause tactile sensors to wear out over time. To prevent that from happening, Meta’s ReSkin physically separates the tactile sensing material from the chip that reads and interprets the data.
Typically, a sensor requires some type of physical electrical connection between the two, which makes the touch-sensitive material difficult to replace. But since ReSkin uses magnetic signals to pass the “skin’s” data to a chip, replacing the tactile skin is easily done.
As Meta explains: “Because ReSkin is a deformable elastomer with embedded magnetic particles, when it deforms in any way, the surrounding magnetic signal changes. We can measure these changes with nearby magnetometers and use data-driven techniques to translate this data into information such as contact location and amount of applied force.”
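To make that idea concrete, here’s a toy sketch of the readout concept: magnetometers under the elastomer see field changes when the skin deforms, and a calibrated model turns those changes into a contact location and a force estimate. Everything here — the function name, the grid layout, the linear calibration constant — is my own invention for illustration, not Meta’s actual code.

```python
# Toy sketch of the ReSkin readout concept (all names and the linear
# calibration are hypothetical, not from Meta's implementation).

def estimate_contact(baseline, reading, positions, gain=0.5):
    """Estimate a contact point and force from per-magnetometer field changes.

    baseline, reading: field magnitudes, one value per magnetometer
    positions: (x, y) location of each magnetometer under the skin
    gain: made-up calibration constant (force units per field unit)
    """
    deltas = [abs(r - b) for b, r in zip(baseline, reading)]
    total = sum(deltas)
    if total == 0:
        return None, 0.0  # no deformation detected
    # Contact point: centroid of magnetometer positions, weighted by how
    # much each magnetometer's signal changed.
    x = sum(d * p[0] for d, p in zip(deltas, positions)) / total
    y = sum(d * p[1] for d, p in zip(deltas, positions)) / total
    force = gain * total  # crude linear force model
    return (x, y), force

# Example: a 2x2 grid of magnetometers, pressing near the (1, 0) corner
baseline = [10.0, 10.0, 10.0, 10.0]
reading = [10.2, 11.8, 10.1, 10.3]
positions = [(0, 0), (1, 0), (0, 1), (1, 1)]
loc, force = estimate_contact(baseline, reading, positions)
```

In the real system, Meta describes data-driven (machine-learned) models doing this translation rather than a hand-tuned centroid, but the input and output are the same shape.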
And because ReSkin can deform, much like skin, it doesn’t just provide pressure data for a single spot; it can report multiple contact points at varying pressure levels.
So what the heck does ReSkin have to do with the smart home?
At the moment, nothing. For now, ReSkin is just a research project. However, all of the related designs, documentation, code, and base machine learning models will be released to the public. There’s already an open source GitHub repository for working with ReSkin. That opens up a world of possibilities that up until now haven’t been available in the smart home.
Here’s one example. At some point almost every day, I sit in the same chair, either to work or to kick back and watch TV on a mobile device. If I take my seat at nighttime, I have to ask one digital assistant or another to switch on the lamp next to my chair. Sure, I could use a motion sensor. Or I could rig up some very basic pressure sensors and make my own “smart furniture” so the light goes on when I sit down. (I’ve tried the latter, but it’s not a bulletproof solution due to the lack of precision.)
Now imagine what would happen if there were ReSkin sensors in the seat cushion instead. Not only would the chair know that someone has sat in it, but the amount of weight on the chair would signify who was sitting in it. That, in turn, would offer context as to what should happen next with the lights — I like daylight bulb colors, for example, while my wife prefers soft white. So in this case, the ability to identify who is sitting in the chair would enable a more personal smart home experience.
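Here’s a rough sketch of how that could work: sum the pressure readings from the cushion, match the total against known household members, and return that person’s light preference. The profiles, weights, and tolerance below are all made up for the example.

```python
# Hypothetical occupant identification from seat-cushion pressure.
# Profiles, weights, and the matching tolerance are invented for illustration.

PROFILES = {
    "me": {"weight_kg": 82.0, "light": "daylight"},
    "spouse": {"weight_kg": 61.0, "light": "soft white"},
}

def identify_sitter(sensor_readings_kg, tolerance_kg=5.0):
    """Match the summed sensor load to the closest known profile."""
    total = sum(sensor_readings_kg)
    best, best_diff = None, tolerance_kg
    for name, profile in PROFILES.items():
        diff = abs(total - profile["weight_kg"])
        if diff < best_diff:
            best, best_diff = name, diff
    return best

def light_setting(sensor_readings_kg):
    """Pick a bulb color for whoever just sat down, or a default."""
    who = identify_sitter(sensor_readings_kg)
    return PROFILES[who]["light"] if who else "default"
```

So four cushion sensors reading roughly 20 kg each would resolve to me and my daylight bulbs, while a lighter total would switch the lamp to soft white instead.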
In this particular case, there would also be a potential wellness implication. Let’s say the ReSkin sensors are in both the chair’s seat cushion and the backrest. The chair would be able to help prevent certain back or neck aches by monitoring the sitter’s posture based on the pressure data those sensors read. It would be nice, although a bit nagging, for the chair to sound an alert when bad posture is setting in.
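A posture check like that could be as simple as comparing where the pressure sits. The thresholds and the slouch heuristic below are invented for this sketch; a real product would presumably learn these patterns from data.

```python
# Made-up posture heuristic from normalized seat and backrest pressure
# readings (0.0 to 1.0). Thresholds are arbitrary illustration values.

def posture_alert(backrest_upper, backrest_lower, seat_front, seat_rear):
    """Return a warning string when the pressure pattern suggests bad posture."""
    # Slouching tends to unload the upper backrest and shift weight
    # toward the front edge of the seat.
    if backrest_upper < 0.1 and seat_front > seat_rear:
        return "Slouch detected: sit back and straighten up."
    # Barely touching the backrest at all suggests perching on the edge.
    if backrest_lower < 0.05 and backrest_upper < 0.05:
        return "Perched on the edge of the seat."
    return None
```

The chair (or whatever hub it talks to) would run a check like this every few minutes and only nag when a bad pattern persists.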
Obviously, reading and interpreting sensor data from ReSkin is only part of the equation. The data from my chair example still needs to be processed by some computing device to know that I’m in the chair and want the lights on, or to warn me of potential aches from my posture. And that processed data will need to be integrated with the smart home to actually fire up the lights. So I’m not suggesting that “2022 will be the year of smart furniture.” I’m just trying to look beyond the obvious use cases of this technology.
And there are plenty in the smart home.
As a kid, I always got yelled at for messing with thermostat dials around the house. Add some ReSkin or similar technology to physical HVAC controls and you can ensure that making temperature changes requires a certain amount of hand pressure. Apply that concept to volume control and television remotes can quickly adjust sound output based on how much or how little pressure you use to control them.
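Both of those ideas boil down to the same trick: treating pressure as an extra input dimension. Here’s a minimal sketch, with thresholds and the step curve picked arbitrarily for illustration.

```python
# Hypothetical pressure-aware controls. All thresholds and the step
# curve are arbitrary illustration values.

def dial_accepted(pressure, min_pressure=0.6):
    """Child-proof thermostat dial: only accept a deliberately firm grip.

    pressure is a normalized reading (0.0 to 1.0) from the dial's surface.
    """
    return pressure >= min_pressure

def volume_step(pressure, max_step=10):
    """Remote volume button: a harder press means a bigger volume jump."""
    pressure = max(0.0, min(1.0, pressure))  # clamp to the valid range
    return max(1, round(pressure * max_step)) if pressure > 0 else 0
```

A kid idly spinning the thermostat dial with a light touch would fail `dial_accepted`, while a gentle tap on the remote nudges the volume by one and a firm press jumps it by ten.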
Let’s go one step further and look at smart locks that use numeric keypads. I’d say they’re pretty secure as is, depending on the keycode used to configure the lock. But add pressure options to the number pad so that different numbers in the code are pressed at different pressure levels and you’ve got a much more secure front door.
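In code, that pressure-augmented keycode might look something like this: each entry in the secret is a digit plus a pressure level, so guessing the right digits in the right order is no longer enough. The two-level scheme and every name here are my own invention for the sketch.

```python
# Hypothetical pressure-coded keypad: the secret pairs each digit with a
# required press level. Two levels shown; a real lock could use more.

LIGHT, FIRM = "light", "firm"

def classify_press(pressure, firm_threshold=0.5):
    """Bucket a raw pressure reading (0.0 to 1.0) into a press level."""
    return FIRM if pressure >= firm_threshold else LIGHT

def unlock(attempt, secret):
    """attempt: list of (digit, raw_pressure); secret: list of (digit, level)."""
    if len(attempt) != len(secret):
        return False
    return all(
        digit == s_digit and classify_press(p) == s_level
        for (digit, p), (s_digit, s_level) in zip(attempt, secret)
    )

# Example secret: 4-2-7-1, pressed firm-light-firm-light
SECRET = [(4, FIRM), (2, LIGHT), (7, FIRM), (1, LIGHT)]
```

Someone who shoulder-surfed the digits would still get locked out by entering them all at the same pressure.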
I can envision a range of smart home use cases like these for ReSkin combined with machine learning models and connectivity. No, they won’t be arriving any time soon, but they’re out there, just waiting to be implemented. Until then, I’ll just have to enjoy the idea of robotic hands that can pick up the grape I dropped while sitting in my dumb chair.