Remotely Processed Visual and Odometric SLAM


Omer Mano
Wayne Chang
Eric Wengrowski

Prof. Kristin Dana

Our objective is to design and build a robotic mapping system that collects visual and odometric data, which is remotely processed by a server for localization and reconstruction.


Robot localization is extremely important for any autonomous mobile or mapping system. Relying solely on motor control to determine robot position is highly unreliable, especially in the steady state, where drift accumulates. SLAM (Simultaneous Localization and Mapping) is a systematic method of triangulating robot position based on the movement of visual landmarks. However, SLAM estimates usually contain positional uncertainties that are significant in the transient state.
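To illustrate why motor-based position estimates drift, consider dead reckoning for a differential-drive robot from wheel-encoder ticks. This is a generic sketch, not the project's actual implementation; the wheel radius, tick resolution, and wheel base below are hypothetical values, and any encoder miscount or wheel slip feeds directly into the pose with no correction.

```python
import math

# Hypothetical robot parameters (not measured from the actual platform).
WHEEL_RADIUS = 0.03   # wheel radius in meters
TICKS_PER_REV = 360   # encoder ticks per wheel revolution
WHEEL_BASE = 0.15     # distance between the two wheels in meters

def dead_reckon(pose, left_ticks, right_ticks):
    """Update pose (x, y, theta) from one pair of encoder tick counts.

    Errors are never corrected, so they accumulate over time --
    the source of the steady-state unreliability described above.
    """
    x, y, theta = pose
    # Arc length traveled by each wheel.
    dl = 2 * math.pi * WHEEL_RADIUS * left_ticks / TICKS_PER_REV
    dr = 2 * math.pi * WHEEL_RADIUS * right_ticks / TICKS_PER_REV
    d = (dl + dr) / 2                 # distance traveled by robot center
    dtheta = (dr - dl) / WHEEL_BASE   # change in heading
    # Integrate motion using the midpoint heading.
    return (x + d * math.cos(theta + dtheta / 2),
            y + d * math.sin(theta + dtheta / 2),
            theta + dtheta)
```

For example, one full revolution of both wheels moves the robot straight ahead by the wheel circumference, but a single missed tick on one encoder already bends the estimated path.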


More robust positional awareness can be achieved through a hybrid approach that combines visual SLAM with rotary-encoder odometry. With a more reliable estimate of robot position, the relative distances of landmarks can be computed more accurately, and more precise maps of the robot's environment can be generated.
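One common way to realize such a hybrid is to fuse the two position estimates by inverse-variance weighting, as in a one-dimensional Kalman update. The sketch below is an assumption about how the fusion could work, not the project's actual algorithm; the variance values are hypothetical.

```python
def fuse(odom_pos, odom_var, visual_pos, visual_var):
    """Fuse two independent 1-D position estimates.

    Treats the odometry estimate as the prediction and the visual
    SLAM estimate as the measurement; the gain k weights whichever
    source is currently more certain (smaller variance).
    """
    k = odom_var / (odom_var + visual_var)       # Kalman-style gain
    pos = odom_pos + k * (visual_pos - odom_pos)  # blended position
    var = (1 - k) * odom_var                      # reduced uncertainty
    return pos, var
```

With equal variances the result is the simple average; when visual SLAM is in its uncertain transient state (large variance), the fused estimate leans on odometry, and as landmark tracking stabilizes the weighting shifts toward the visual estimate.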