SLAM, or simultaneous localization and mapping, is a well-studied computational problem that involves building and updating a map of an environment while simultaneously tracking an agent's location within it. (Typically, said agent is a drone or robot.) The advent of cheap, ubiquitous depth sensors and sophisticated algorithms has largely tamed the problem, but even state-of-the-art vision systems aren't perfect: symmetric and repetitive patterns sometimes result in faulty maps, and the aforementioned sensors tend to generate large, unwieldy volumes of data.
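To make the core loop concrete, here is a minimal toy sketch of the alternation SLAM performs: predict the agent's pose from odometry, then refine the map from an observation. All function names and numbers are illustrative assumptions, not taken from any particular SLAM system.

```python
import math

def predict(pose, distance, turn):
    """Dead-reckoning prediction: advance the pose (x, y, heading) by odometry."""
    x, y, theta = pose
    theta += turn
    return (x + distance * math.cos(theta),
            y + distance * math.sin(theta),
            theta)

def observe_landmark(pose, rng, bearing):
    """Convert a range/bearing measurement into a world-frame landmark estimate."""
    x, y, theta = pose
    return (x + rng * math.cos(theta + bearing),
            y + rng * math.sin(theta + bearing))

pose = (0.0, 0.0, 0.0)   # start at the origin, facing along x
landmarks = {}           # the map being built

# One step of the SLAM loop: move, then map what we see.
pose = predict(pose, distance=1.0, turn=0.1)
landmarks["corner_A"] = observe_landmark(pose, rng=2.0, bearing=0.5)
print(pose, landmarks)
```

The failure mode the article mentions falls out of this structure: if two distinct places produce near-identical observations (repetitive corridors, symmetric rooms), the correction step can associate a measurement with the wrong landmark and corrupt the map.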

That’s why researchers propose using Wi-Fi sensing to supplement the technique. A newly published paper on the preprint server Arxiv.org describes a “generic way” to integrate wireless data into visual SLAM algorithms, with the goal of improving their accuracy while minimizing hardware overhead.
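One plausible form such integration can take, sketched below under assumptions of ours rather than the paper's actual interface: use Wi-Fi received-signal-strength (RSSI) fingerprints as a cheap pre-filter for loop-closure candidates, so the expensive visual matcher only runs on keyframes whose radio environment looks similar to the current one. The fingerprint format, threshold, and function names here are all hypothetical.

```python
def rssi_similarity(fp_a, fp_b, missing_dbm=-100.0):
    """Compare two {BSSID: RSSI in dBm} fingerprints; higher is more similar."""
    keys = set(fp_a) | set(fp_b)
    if not keys:
        return 0.0
    # Access points unseen in one scan are treated as very weak (missing_dbm).
    err = sum(abs(fp_a.get(k, missing_dbm) - fp_b.get(k, missing_dbm)) for k in keys)
    return -err / len(keys)  # negated mean absolute RSSI difference

def loop_closure_candidates(current_fp, keyframe_fps, threshold=-10.0):
    """Return ids of keyframes whose Wi-Fi fingerprint is close to the current one."""
    return [kf_id for kf_id, fp in keyframe_fps.items()
            if rssi_similarity(current_fp, fp) >= threshold]

keyframes = {
    1: {"aa:bb:cc:01": -40, "aa:bb:cc:02": -70},
    2: {"aa:bb:cc:03": -55},
}
now = {"aa:bb:cc:01": -42, "aa:bb:cc:02": -68}
print(loop_closure_candidates(now, keyframes))  # -> [1]
```

Because a repetitive-looking corridor at one end of a building usually sees a different set of access points than its visual twin at the other end, even this coarse radio check can disambiguate places that confuse a purely visual matcher, and it prunes the volume of sensor data the vision pipeline must process.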

“Wi-Fi radio is available on most robots
