Researchers at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) have developed a system — RF-Diary — that they say can detect and caption the behaviors of people within a room from radio signals. They claim their approach can observe people through walls and other occlusions even in complete darkness, and that it learns to track those people’s interactions with objects like cups of water.
While the work has obvious surveillance applications, a fact the researchers are aware of and built protections against, the primary motivation was developing a monitoring system for health-impaired family members. Elderly residents might suffer from memory problems that cause them to forget things like whether they took certain medications or brushed their teeth, as well as from physical ailments that make them prone to injuring themselves. RF-Diary could be configured to provide regular updates to caregivers, the coauthors say, allowing them to administer remote care while providing peace of mind.
RF-Diary draws on radio frequency (RF) signals to capture object-level information. A radio with two antenna arrays oriented vertically and horizontally (each with 12 antennas) transmits a waveform and sweeps a frequency range, providing angular and depth information. From the RF data, an algorithm extracts features to estimate the position of people while a separate algorithm derives insights from a room’s floormap. (A third algorithm generates captions.) The floormap — which is marked with the size and location of objects like “bed,” “sofa,” “TV,” and “fridge,” as determined by a laser measure — provides information about the surrounding environment, enabling the system to infer interactions with objects. Each object’s location is encoded with a person’s location to allow RF-Diary to pay varying attention to objects depending on their proximity to the person.
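To illustrate the idea of proximity-dependent attention over floormap objects, here is a minimal sketch. It assumes each floormap entry is a labeled (x, y) coordinate and the person’s position comes from the RF-based tracker; the function name, the distance-based softmax weighting, and the example coordinates are all illustrative assumptions, not the authors’ actual model.

```python
import math

def attention_weights(person_xy, objects, temperature=1.0):
    """Hypothetical proximity weighting: softmax over negative distances,
    so objects closer to the person receive larger attention weights."""
    px, py = person_xy
    # Score each object by how close it is to the person.
    scores = [-math.hypot(ox - px, oy - py) / temperature
              for _name, ox, oy in objects]
    # Numerically stable softmax over the scores.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Illustrative floormap: object name with its (x, y) position in meters.
floormap = [("bed", 1.0, 4.0), ("sofa", 3.0, 1.0), ("fridge", 6.0, 5.0)]
weights = attention_weights((3.2, 1.1), floormap)
# The sofa, being nearest the person, gets the largest weight.
```

In the actual system such weights would presumably be learned jointly with the captioning model rather than fixed by raw distance, but the sketch captures why encoding object positions alongside the person’s position lets the model favor nearby objects.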
To train the system, the researchers collected a corpus of synchronized RF signals, videos from a 12-viewpoint camera system, floormaps, and captions labeled by Amazon Mechanical Turk workers to describe behaviors performed by volunteers. In total, the data set comprised 1,035 30-second clips across 10 indoor environments (such as a bedroom, kitchen, living room, lounge, and office) capturing 157 actions and 38 objects that were interacted with.
In the interest of privacy, the researchers had the people being monitored agree to perform sets of moves. They then asked the volunteers to walk around their living spaces, ensuring the system couldn’t be used to monitor areas to which the volunteers didn’t have access.
In experiments, the researchers report RF-Diary was able to generate captions about a person’s activities with greater than 90% accuracy on over 30 different actions including sleeping, reading, cooking, and watching television. Moreover, the system managed to generalize to new people and homes it hadn’t seen before.
The researchers plan to adapt the system to work in real-world homes and hospitals as a next step, with the goal of turning RF-Diary into a commercial product. Indeed, in light of recent U.S. Food and Drug Administration guidance to expand the use of remote monitoring devices that facilitate patient management, hospitals and health systems are piloting AI solutions that promise to minimize health workers’ exposure while improving health outcomes. Clinicians recently used a device developed by a different MIT CSAIL team — Emerald — to remotely monitor a patient’s breathing, movement, and sleep patterns. Orion Health released a platform that will allow providers to identify patients at risk of contracting COVID-19.
In the nearer term, the RF-Diary team is scheduled to present its work at the European Conference on Computer Vision (ECCV) 2020, which runs from August 23 to August 27.