Users of Augmented Reality Headsets can See Hidden Objects

AR headsets are wearable devices that overlay digital content onto the real world, enriching the user's perception of their surroundings. While AR technology can provide many visual enhancements, such as displaying virtual objects or information in the user's field of view, seeing hidden objects has not been a standard capability of AR headsets.

The researchers created the X-AR augmented reality headset, which combines computer vision and wireless perception to detect hidden objects in a room and then guide the wearer to retrieve the desired item.

MIT researchers have developed an augmented reality headset that provides the wearer with X-ray vision. The headset combines computer vision and wireless perception to automatically locate a specific item that is hidden from view, such as inside a box or under a pile, and then guides the user to retrieve it.

The system employs radio frequency (RF) signals, which can pass through common materials such as cardboard boxes, plastic containers, or wooden dividers, to locate hidden items labeled with RFID tags, which reflect signals sent by an RF antenna.

The headset guides the wearer as they walk through a room toward the location of the item, which appears as a transparent sphere in the augmented reality (AR) interface. Once the item is in the user’s hand, the headset, called X-AR, verifies that they have picked up the correct object.

When the researchers tested X-AR in a warehouse-like setting, the headset was able to locate hidden items to within 9.8 centimeters on average. It also confirmed that users picked up the correct item with 96% accuracy.

Our whole goal with this project was to build an augmented reality system that allows you to see things that are invisible – things that are in boxes or around corners – and in doing so, it can guide you toward them and truly allow you to see the physical world in ways that were not possible before.

Fadel Adib

X-AR could help e-commerce warehouse workers find items quickly on cluttered shelves or buried in boxes, or it could identify the exact item for an order when many similar objects are in the same bin. It could also be used in a manufacturing facility to assist technicians in locating the correct parts for an assembly.

“Our whole goal with this project was to build an augmented reality system that allows you to see things that are invisible – things that are in boxes or around corners – and in doing so, it can guide you toward them and truly allow you to see the physical world in ways that were not possible before,” says Fadel Adib, who is an associate professor in the Department of Electrical Engineering and Computer Science, the director of the Signal Kinetics group in the Media Lab, and the senior author of a paper on X-AR.

Adib’s co-authors are research assistants Tara Boroushaki, who is the paper’s lead author; Maisy Lam; Laura Dodds; and former postdoc Aline Eid, who is now an assistant professor at the University of Michigan. The research will be presented at the USENIX Symposium on Networked Systems Design and Implementation.

Augmenting an AR headset

To create an augmented reality headset with X-ray vision, the researchers first had to outfit an existing headset with an antenna capable of communicating with RFID-tagged items. Most RFID localization systems employ multiple antennas spaced meters apart, but the researchers required a single lightweight antenna with sufficient bandwidth to communicate with the tags.

“Designing an antenna that would fit on the headset without covering any of the cameras or obstructing their operation was a major challenge. This is critical because we need to use all of the specs on the visor,” Eid explains.

The team experimented with a simple, lightweight loop antenna, tapering it and adding gaps, both of which increase bandwidth. Because antennas typically operate in the open air, the researchers also had to optimize the antenna to transmit and receive signals while attached to the headset's visor.

After constructing an effective antenna, the team concentrated on using it to locate RFID-tagged items. They used a technique known as synthetic aperture radar, which is similar to how airplanes image ground objects. As the user moves around the room, X-AR takes measurements with its antenna from various vantage points and then combines those measurements. As a result, it functions similarly to an antenna array, where measurements from multiple antennas are combined to localize a device.

X-AR utilizes visual data from the headset's self-tracking capability to build a map of the environment and determine its own location within it. As the user walks, the system computes the probability that the RFID tag is at each location in the map. The probability peaks at the tag's true location, so X-AR uses this map to zero in on the hidden object.
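As a rough numerical sketch of this SAR-style back-projection idea (an illustration of the general technique, not X-AR's actual implementation; the wavelength, grid, and scenario below are invented for the example), one can undo the phase each vantage point would see for a tag at every candidate grid cell and sum the measurements, which add up coherently only at the true location:

```python
import numpy as np

# Illustrative sketch of SAR-style tag localization (not X-AR's real code).
# As the headset antenna moves, each vantage point yields a complex channel
# measurement whose phase encodes the round-trip distance to the tag. For
# every cell of a candidate grid, we undo the phase expected for a tag at
# that cell and sum the measurements; the terms align (peak) only at the
# true tag location. Normalizing the scores gives a probability-like map.

WAVELENGTH = 0.33  # meters, roughly the 915 MHz UHF RFID band

def tag_probability_map(antenna_positions, measurements, grid):
    """Back-project measurements onto grid cells; return normalized scores."""
    scores = np.empty(len(grid))
    for i, cell in enumerate(grid):
        dists = np.linalg.norm(antenna_positions - cell, axis=1)
        expected = np.exp(-1j * 4 * np.pi * dists / WAVELENGTH)
        scores[i] = np.abs(np.sum(measurements * np.conj(expected))) ** 2
    return scores / scores.sum()

# Toy scenario: the antenna sweeps 50 vantage points along a 1 m line
# while a tag sits at (1.0, 2.0); measurements are noise-free phases.
positions = np.column_stack([np.linspace(0.0, 1.0, 50), np.zeros(50)])
tag = np.array([1.0, 2.0])
d = np.linalg.norm(positions - tag, axis=1)
meas = np.exp(-1j * 4 * np.pi * d / WAVELENGTH)

xs, ys = np.meshgrid(np.linspace(0, 2, 21), np.linspace(0, 3, 31))
grid = np.column_stack([xs.ravel(), ys.ravel()])
probs = tag_probability_map(positions, meas, grid)
print(grid[np.argmax(probs)])  # peaks at the true location, [1. 2.]
```

The key property the article describes falls out of the math: the longer the user walks, the more vantage points contribute to the sum, and the sharper the peak at the tag's location becomes.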

“While it presented a challenge when we were designing the system, we found in our experiments that it actually works well with natural human motion. Because humans move around a lot, it allows us to take measurements from lots of different locations and accurately localize an item,” Dodds says.

Once X-AR has located the item and the user picks it up, the headset must confirm that the user grabbed the correct object. At this point, however, the user is standing still and the headset antenna is no longer moving, so SAR cannot be used to locate the tag.

However, as the user picks up the item, the RFID tag moves with it. X-AR can measure the motion of the RFID tag and use the headset's hand-tracking capability to localize the item in the user's hand. It then checks the signals the tag transmits to verify that it is the correct object.
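The verification idea above can be sketched numerically. This is a hypothetical illustration, not the paper's algorithm: the function name, wavelength, and threshold are invented, and it simply checks whether a tag's phase readings change in lockstep with the hand's tracked motion.

```python
import numpy as np

# Hypothetical sketch of pickup verification: once the user grabs an item,
# the tag's RF readings start changing together with the tracked hand. We
# test whether a candidate tag's unwrapped phase (converted to a distance
# change) correlates with the hand-to-antenna distance from hand tracking.

WAVELENGTH = 0.33  # meters, roughly the 915 MHz UHF RFID band

def moves_with_hand(tag_phases, hand_distances, threshold=0.9):
    """True if the tag's phase history tracks the hand's motion."""
    # Round-trip phase maps to distance via lambda / (4 * pi)
    tag_motion = np.unwrap(tag_phases) * WAVELENGTH / (4 * np.pi)
    tag_motion = tag_motion - tag_motion[0]
    hand_motion = hand_distances - hand_distances[0]
    corr = np.corrcoef(tag_motion, hand_motion)[0, 1]
    return corr > threshold

# Toy data: the hand moves 30 cm away; the picked tag's phase follows it,
# while a tag left on the shelf shows only small measurement noise.
t = np.linspace(0, 1, 40)
hand = 0.5 + 0.3 * t
picked_phase = 4 * np.pi * hand / WAVELENGTH
shelf_phase = 2.0 + 0.01 * np.random.default_rng(1).standard_normal(40)

print(moves_with_hand(picked_phase, hand))  # True: tag moved with the hand
print(moves_with_hand(shelf_phase, hand))   # False: near-zero correlation
```

A correlation-style check like this distinguishes the tag in the user's hand from identical-looking tags nearby that stayed put, which is the scenario the researchers describe.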

The researchers used the headset’s holographic visualization capabilities to display this information to the user in a simple manner. After putting on the headset, the user navigates through menus to select an object from a database of tagged items. After the object has been localized, it is surrounded by a transparent sphere, allowing the user to see its location in the room. The device then projects the trajectory to that item as footsteps on the floor, which can dynamically update as the user walks.
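The footstep trail described above can be sketched as a simple geometric routine. The function name and step spacing here are invented for illustration; the idea is just to place evenly spaced markers along the path and recompute them as the user moves, so the trail updates dynamically.

```python
import math

# Illustrative sketch (names and spacing are invented, not X-AR's API):
# place footstep markers at regular intervals on the floor between the
# user and the located item, recomputed each frame so the trail shortens
# as the user walks toward the target.

STEP_LENGTH = 0.5  # meters between projected footstep holograms

def footstep_markers(user_pos, item_pos):
    """Return a list of (x, y) floor positions for footstep holograms."""
    ux, uy = user_pos
    ix, iy = item_pos
    dist = math.hypot(ix - ux, iy - uy)
    steps = int(dist // STEP_LENGTH)
    return [
        (ux + (ix - ux) * k * STEP_LENGTH / dist,
         uy + (iy - uy) * k * STEP_LENGTH / dist)
        for k in range(1, steps + 1)
    ]

# As the user walks toward an item at (3.0, 4.0), the trail gets shorter.
print(len(footstep_markers((0.0, 0.0), (3.0, 4.0))))  # 10 markers
print(len(footstep_markers((1.0, 1.0), (3.0, 4.0))))  # 7 markers
```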

Testing the headset

To put X-AR to the test, the researchers built a simulated warehouse out of cardboard boxes and plastic bins and filled it with RFID-tagged items. They found that X-AR could guide the user toward a targeted item with less than 10 centimeters of error, meaning that, on average, the item was found less than 10 centimeters from the location X-AR pointed to. The baseline methods they tested had a median error of 25 to 35 centimeters.

They also found that X-AR verified the user had selected the correct item 98.9% of the time, meaning it can reduce picking errors by 98.9%. Even when the item was still inside its box, the system was 91.9% accurate.

“The system does not need to see the item to verify that you have picked up the correct item. If you have ten different phones in similar packaging, you might not be able to tell them apart, but it can help you pick the right one,” Boroushaki says.

After demonstrating the success of X-AR, the researchers intend to investigate how other sensing modalities, such as WiFi, mmWave technology, or terahertz waves, could enhance its visualization and interaction capabilities. They also plan to improve the antenna so its range extends beyond 3 meters, and to extend the system to multiple, coordinated headsets.