While looking for new research on robotics last week, I caught the news from Facebook — I mean, Meta — about its new flexible sensor. It unveiled what it calls ReSkin technology for tactile sensing, created in conjunction with Carnegie Mellon University researchers. I think this technology could have a lot of potential in the smart home.
ReSkin is essentially a low-cost flexible sensor (under $6 per sensor at quantities of 100) meant to mimic the sense of touch of human skin. Because it can measure tactile pressure, non-human hands, such as those of robots, can pick up fragile objects without crushing them. Here’s a video containing several ReSkin demonstrations, including one of a robotic gripper picking up grapes without turning them into pulpy juice.
This isn’t the first time we’ve seen tactile technology. Earlier this year, the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL) showed off similar elastic sensors for robotic arms. Where ReSkin differs, however, is in how it gets data from the sensing material to the chip.
Movement can cause tactile sensors to wear out over time. To prevent that from happening, Meta’s ReSkin physically separates the tactile sensing material from the chip that reads and interprets the data.
Typically, a sensor requires some type of physical electrical connection between the two, which makes the touch-sensitive material difficult to replace. But since ReSkin uses magnetic signals to pass the “skin’s” data to a chip, replacing the tactile skin is easily done.
As Meta explains: “Because ReSkin is a deformable elastomer with embedded magnetic particles, when it deforms in any way, the surrounding magnetic signal changes. We can measure these changes with nearby magnetometers and use data-driven techniques to translate this data into information such as contact location and amount of applied force.”
And because ReSkin can deform, much like skin, it doesn’t just provide pressure data for a single spot; it can report multiple pressure points at varying levels.
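To make that "data-driven techniques" step concrete, here's a minimal sketch of the idea in Python. Everything here is invented for illustration (the magnetometer count, the synthetic data, the use of a plain least-squares fit rather than whatever model Meta actually trains): we simulate magnetometer deltas produced by a deformation, then learn the inverse map from readings back to contact location and force.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 5 magnetometers x 3 axes = 15 readings per sample.
# We synthesize a ground-truth linear map from deformation state
# (contact x, contact y, applied force) to magnetometer deltas, then
# learn the inverse from noisy data -- a stand-in for the "data-driven
# techniques" Meta describes, not their actual model.
n_readings, n_targets, n_samples = 15, 3, 500

true_forward = rng.normal(size=(n_targets, n_readings))          # deformation -> field deltas
targets = rng.uniform(-1.0, 1.0, size=(n_samples, n_targets))    # (x, y, force) samples
readings = targets @ true_forward + rng.normal(scale=0.01, size=(n_samples, n_readings))

# Fit the inverse map with ordinary least squares.
inverse, *_ = np.linalg.lstsq(readings, targets, rcond=None)

# Predict contact location and force for a new, unseen deformation.
test_target = np.array([[0.3, -0.2, 0.8]])
test_reading = test_target @ true_forward
prediction = test_reading @ inverse
print(prediction.round(2))  # close to the true (0.3, -0.2, 0.8)
```

In reality the elastomer's response is nonlinear, which is presumably why Meta reaches for learned models rather than a fixed calibration, but the shape of the problem (readings in, location and force out) is the same.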
So what the heck does ReSkin have to do with the smart home?
At the moment, nothing. For now, ReSkin is just a research project. However, all of the related design files, documentation, code, and base machine learning models will be released to the public. There’s already an open source GitHub repository for ReSkin. That opens up a world of possibilities that until now haven’t been available in the smart home.
Here’s one example. At some point almost every day, I sit in the same chair, either to work or to kick back and watch TV on a mobile device. If I take my seat at nighttime, I have to ask one digital assistant or another to switch on the lamp next to my chair. Sure, I could use a motion sensor. Or I could rig up some very basic pressure sensors and make my own “smart furniture” so the light goes on when I sit down. (I’ve tried the latter, but it’s not a bulletproof solution due to the lack of precision.)
Now imagine what would happen if there were ReSkin sensors in the seat cushion instead. Not only would the chair know that someone has sat in it, but the amount of weight on the chair would signify who was sitting in it. That, in turn, would offer context as to what should happen next with the lights — I like daylight bulb colors, for example, while my wife prefers soft white. So in this case, the ability to identify who is sitting in the chair would enable a more personal smart home experience.
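As a toy illustration of that idea, here's a sketch of mapping a cushion's total pressure reading to a person and their preferred light scene. The names, weight ranges, and scene labels are all invented:

```python
# Hypothetical sketch: identify who sat down from the total pressure a
# ReSkin-equipped cushion reports, then pick that person's light scene.
PROFILES = [
    ("me",     (165, 195), "daylight"),
    ("spouse", (120, 150), "soft white"),
]

def scene_for_weight(pounds):
    """Return (person, scene) for whoever's weight range matches, else a default."""
    for name, (low, high), scene in PROFILES:
        if low <= pounds <= high:
            return name, scene
    return "guest", "neutral white"

print(scene_for_weight(180))  # ('me', 'daylight')
print(scene_for_weight(135))  # ('spouse', 'soft white')
```

A real system would need overlapping ranges, pets, and bags of groceries handled gracefully, but the lookup itself is this simple.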
In this particular case, there would also be a potential wellness implication. Let’s say the ReSkin sensors are in both the chair’s seat cushion and the backrest. The chair could help prevent certain back or neck aches by monitoring the sitter’s posture based on the pressure data those sensors read. It would be nice, although a bit nagging, for the chair to sound an alert when bad posture is setting in.
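One simple version of that posture check: compare how much pressure the backrest sees against the seat. When someone slouches forward, backrest contact drops. The thresholds and readings below are made up for the sketch:

```python
# Hypothetical posture alert: flag when backrest pressure falls below a
# fraction of the total pressure across seat and backrest sensors.
def posture_alert(seat_readings, back_readings, min_back_ratio=0.25):
    """Return True when the backrest carries too little of the load."""
    seat = sum(seat_readings)
    back = sum(back_readings)
    total = seat + back
    if total == 0:  # nobody in the chair
        return False
    return back / total < min_back_ratio

print(posture_alert([40, 42, 38], [25, 25]))  # False: sitting upright
print(posture_alert([50, 52, 48], [3, 2]))    # True: slouching forward
```

A production version would average over time before nagging; nobody wants a chair that beeps every time they lean forward to grab a drink.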
Obviously, reading and interpreting sensor data from ReSkin is only part of the equation. The data from my chair example still needs to be processed by some computing device to warn me of potential aches from my posture, or to know that I’m in the chair and want the lights on. That processed data will then need integration with the smart home to fire up the lights. So I’m not suggesting that “2022 will be the year of smart furniture.” I’m just trying to look beyond the obvious use cases of this technology.
And there are plenty in the smart home.
As a kid, I always got yelled at for messing with thermostat dials around the house. Add some ReSkin or similar technology to physical HVAC controls and you can ensure that making temperature changes requires a certain amount of hand pressure. Apply that concept to volume control and television remotes can quickly adjust sound output based on how much or how little pressure you use to control them.
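Here's what that pressure-gated control logic might look like, with every number invented for the sketch: the thermostat ignores light touches (no more kids spinning the dial), and the remote scales the volume step by how hard the button is pressed.

```python
# Hypothetical pressure-aware controls. Pressures are normalized 0.0-1.0.
CHILD_PROOF_THRESHOLD = 0.6

def thermostat_change(requested_delta, pressure):
    """Apply a setpoint change only when pressed firmly enough."""
    return requested_delta if pressure >= CHILD_PROOF_THRESHOLD else 0

def volume_step(pressure):
    """A light press nudges the volume by 1; a hard press jumps up to 5 steps."""
    return max(1, round(pressure * 5))

print(thermostat_change(2, 0.3))  # 0 -- too light a touch, ignored
print(thermostat_change(2, 0.8))  # 2 -- firm press accepted
print(volume_step(0.2))           # 1
print(volume_step(0.95))          # 5
```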
Let’s go one step further and look at smart locks that use numeric keypads. I’d say they’re pretty secure as is, depending on the keycode used to configure the lock. But add pressure options to the number pad so that different numbers in the code are pressed at different pressure levels and you’ve got a much more secure front door.
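A sketch of that pressure-coded lock, with the digits, thresholds, and sample code all invented: the secret isn't just the digit sequence but how hard each digit gets pressed.

```python
# Hypothetical pressure-coded keypad. Each keypress arrives as
# (digit, normalized_pressure); presses above the threshold count as "hard".
HARD_PRESS = 0.5

SECRET = [(4, "soft"), (2, "hard"), (7, "soft"), (9, "hard")]

def classify(presses):
    """Turn raw (digit, pressure) events into (digit, 'soft'/'hard') pairs."""
    return [(d, "hard" if p > HARD_PRESS else "soft") for d, p in presses]

def unlock(presses):
    return classify(presses) == SECRET

# Right digits, right pressure pattern:
print(unlock([(4, 0.2), (2, 0.9), (7, 0.3), (9, 0.8)]))  # True
# Right digits, wrong pressure pattern -- the door stays locked:
print(unlock([(4, 0.9), (2, 0.9), (7, 0.9), (9, 0.9)]))  # False
```

In effect, an observer who shoulder-surfs the digits still doesn't have the full code.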
I can envision a range of smart home use cases like these for ReSkin combined with machine learning models and connectivity. No, they won’t be arriving any time soon, but they’re out there, just waiting to be implemented. Until then, I’ll just have to enjoy the idea of robotic hands that can pick up the grape I dropped while sitting in my dumb chair.
Sort of. Tactile sensing is only part of the equation for realizing robotic manipulation that mimics the human hand. It’s really a sensor fusion problem on the input side and a complex mechatronics problem on the output side (many actuators moving in carefully controlled small movements). If you could wire up the muscles on your body to see how many tiny muscles are firing when you do something like pick up four M&M’s out of a deep jar, it might blow your mind. 😉
Very interesting! Since I am quadriparetic with very little hand strength, I always cringe at the thought of any automation controls that require even more hand pressure, or precise levels of hand pressure. A lot of the accessibility requirements for the ADA for buttons and dials are all about REDUCING the pressure and precision required. (In fact, I have a voice controlled microwave specifically because it’s difficult for me to press the buttons.) So this hard press/soft press home automation “solution” is one I really hope doesn’t get widely adopted. In fact, at our house at one point we switched from Schlage to Yale smart locks because Schlage “touch” pads use a technology that requires an actual push while Yale uses capacitive screens that just require a touch.
Although my disability is rare, there are millions of seniors with some hand limitations and a significant gadget market right now in devices that require LESS finger strength to operate. So I’m just not seeing a hard press concept as having the same market opportunities as touch control.
With regard to a pressure pad sensing someone in a chair or bed, there’s lots of existing cheap technology that works well for this, and it doesn’t require the kind of precision ReSkin offers.
And for most of the other use case suggestions, it would be difficult for regular people to precisely and consistently produce the tiny pressure changes that would justify ReSkin over existing technologies.
So instead, I see the probable automation use cases for ReSkin mapping directly to the examples the creators have provided: situations that in a human would require fine motor control. Many will probably be industrial, the factory situations where we still have to use humans because machines don’t yet match their dexterity. Chip manufacture is an obvious example.
Some potential healthcare/disability support use cases as well. Changing dressings, that sort of thing. Did you know that for 40 years there was one small organization, Helping Hands, that trained helper monkeys for those who can’t move their arms? The monkeys could wash the person’s face, brush their hair, put on their glasses, scratch an itch: all things that robots can’t do precisely because they lack the sensitivity ReSkin could provide. But the monkeys had limits because of their own potential health issues. Contagion is a two-way street. The organization decided to stop training new monkey helpers in 2021 and is just beginning to work cooperatively with AI and robotics companies to see if robots could do what the monkeys did. ReSkin could be a huge factor in those projects.
For a much wider market, there is a large demand for something which does not yet exist: a moderately priced automatic toenail clipper. Many people have difficulty cutting their own toenails, professional pedicures are expensive, and often people are uncomfortable asking family members to help. A safe and simple moderately priced (say $125 or so) automatic toenail clipper would sell a lot of units and seems like something ReSkin might help make possible. Look at the automatic nail cutter models that currently sell on Amazon. Most work ok on fingernails but not toenails, and there is clearly a lot of demand.
So I would look for use cases that replace hand movements rather than ones that require humans to exactly match their hand movements to what the machine requires.
Just a thought.