Wheel Odometry to be added to Visual-Inertial Kinematics in upcoming software release
Wheel odometers are sensors that measure the rotation of a device’s wheels. Robots have used odometers for positioning for years, as they provide direct feedback from a robot’s own systems rather than sensing the world around it. By measuring the number and speed of wheel rotations, relatively simple maths can be applied to estimate translation, or distance travelled. Knowing the distance between a pair of wheels and the difference in their rotation rates also makes it possible to calculate rotation and thus estimate heading.
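As a rough sketch of that maths, the snippet below dead-reckons a 2D pose from incremental wheel travel using standard differential-drive kinematics. The function and variable names are our own illustration, not part of SLAMcore’s software.

```python
import math

def update_pose(x, y, theta, d_left, d_right, wheel_base):
    """Dead-reckon a new 2D pose from incremental wheel travel.

    d_left / d_right: distance travelled by each wheel since the last
    update (wheel radius x rotation angle, derived from encoder ticks).
    wheel_base: distance between the two wheels.
    """
    d_center = (d_left + d_right) / 2.0        # forward translation
    d_theta = (d_right - d_left) / wheel_base  # change in heading

    # Integrate assuming motion along the average heading over the step.
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    theta = (theta + d_theta + math.pi) % (2.0 * math.pi) - math.pi

    return x, y, theta
```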
Although it is relatively common to integrate odometers loosely with other sensors such as lidar and IMUs, our system tightly integrates wheel odometry with visual and inertial sensors, providing a flexible, commercial-grade SLAM software product. SLAMcore’s approach stands out because our sensor fusion algorithm intelligently blends all three data streams together rather than calculating averages across each sensor. Combining three sensor types in this way reinforces and greatly improves the accuracy and robustness of our Simultaneous Localization and Mapping (SLAM) estimates.
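To see why blending beats a plain average, consider the classical inverse-variance fusion rule below. This is a toy, one-dimensional illustration of the principle only; SLAMcore’s tightly coupled fusion is considerably more sophisticated, and the numbers here are made up.

```python
def fuse(estimates, variances):
    """Inverse-variance weighting: more certain sensors get more
    influence, unlike a plain average that treats all sensors equally."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    value = sum(w * e for w, e in zip(weights, estimates)) / total
    return value, 1.0 / total  # fused estimate and its (reduced) variance

# Hypothetical visual, inertial and wheel estimates of forward distance:
# the noisy wheel reading contributes, but does not drag the result around.
estimate, variance = fuse([1.00, 1.06, 0.95], [0.01, 0.04, 0.09])
```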
On its own, wheel odometry provides very ‘noisy’ data with high error rates, which is not suitable for accurate SLAM. Skidding, uneven surfaces, weight changes, deformation of wheels and wear and tear on tyres can all reduce the accuracy of wheel odometers. But as part of an array of sensors, odometers can deliver significant advantages to robot location systems. How they are integrated with other sensors, and how the algorithms process their data, can make a huge difference to the overall accuracy, reliability and effectiveness of location estimates. This is why we believe our sensor fusion approach will be of great value to developers. Using established ROS middleware, almost any wheel sensor can now be added to the tightly coupled visual-inertial sensors we support out of the box, including the Intel RealSense Depth Camera D435i, delivering accurate internal and external data for use in SLAM. This opens up the opportunity to use visual SLAM as an accurate and more flexible alternative to many of the expensive lidar systems in use today.
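As an illustration of how a wheel sensor typically enters a ROS pipeline, the sketch below publishes a 2D wheel-odometry estimate as a standard nav_msgs/Odometry message (ROS 1, rospy). The topic name and frame ids are common conventions used as placeholders here, not SLAMcore’s actual interface.

```python
import math
import rospy
from nav_msgs.msg import Odometry

rospy.init_node('wheel_odometry_publisher')
pub = rospy.Publisher('/odom', Odometry, queue_size=10)

def publish_wheel_odometry(x, y, theta, v, omega):
    """Wrap a 2D wheel-odometry estimate in a standard ROS message."""
    msg = Odometry()
    msg.header.stamp = rospy.Time.now()
    msg.header.frame_id = 'odom'
    msg.child_frame_id = 'base_link'
    msg.pose.pose.position.x = x
    msg.pose.pose.position.y = y
    # Planar rotation encoded as a quaternion about the z axis.
    msg.pose.pose.orientation.z = math.sin(theta / 2.0)
    msg.pose.pose.orientation.w = math.cos(theta / 2.0)
    msg.twist.twist.linear.x = v       # forward speed
    msg.twist.twist.angular.z = omega  # turn rate
    pub.publish(msg)
```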
The strengths of each sensor’s data offset weaknesses in the others. As an internal sensor, wheel odometry is especially useful for providing definitive data when sensing the external environment is problematic. For example, in dynamic scenes where lots of movement can confuse the visual data, the IMU and odometry data can be upweighted for better SLAM estimation. If the wheels have not moved, it is safe to assume the robot’s position has not changed. Knowing this can increase the speed of SLAM estimation by constraining the range of potential positions, and therefore the amount of map and visual-inertial data that must be processed to reconfirm location. The combination increases the reliability, accuracy and speed of localization.
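A minimal sketch of that stationarity shortcut, with a hypothetical encoder-tick check and tolerance of our own choosing:

```python
def is_stationary(d_left_ticks, d_right_ticks, tolerance=0):
    """If neither wheel encoder has advanced, assume the robot has not moved."""
    return abs(d_left_ticks) <= tolerance and abs(d_right_ticks) <= tolerance

# Example: skip a wide visual search when the wheels report no motion.
previous_pose = (0.0, 0.0, 0.0)  # (x, y, theta) from the last SLAM update
if is_stationary(d_left_ticks=0, d_right_ticks=0):
    predicted_pose = previous_pose  # constrain the search to the prior pose
```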
Adding wheel odometry to the range of sensors fused in SLAMcore’s algorithms has significant benefits for developers looking for robust, accurate and fast SLAM functionality to add to their designs. Fusing data from the three sensor types also makes calibration of wheel odometry faster and more straightforward. Typically, this is done in customized calibration suites that use pre-specified routes and test patterns to adjust sensors. With our approach, the three sensors work together to calibrate the wheel sensors in a single pass in any suitable location. We estimate that this will not only reduce calibration time by up to a factor of ten but also allow calibration at customer sites, simplifying manufacturing and on-boarding as robots go into commercial production.
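One simple building block of such a calibration, shown purely as an illustration (the function and numbers are hypothetical, not SLAMcore’s algorithm), is fitting a least-squares scale factor between raw wheel-odometry distances and a visual-inertial reference trajectory:

```python
import numpy as np

def calibrate_wheel_scale(odom_distances, reference_distances):
    """Least-squares scale factor mapping raw wheel-odometry distances
    onto reference distances from the visual-inertial estimate.

    A scale != 1.0 indicates, for example, a wrong wheel-radius
    assumption or tyre wear; applying it corrects future readings.
    """
    odom = np.asarray(odom_distances)
    ref = np.asarray(reference_distances)
    return float(np.dot(odom, ref) / np.dot(odom, odom))

# Example: wheel odometry consistently over-reports distance by ~2%.
scale = calibrate_wheel_scale([1.02, 2.04, 0.51], [1.00, 2.00, 0.50])
corrected = 1.02 * scale  # apply the fitted scale to a new raw reading
```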
SLAMcore is committed to providing powerful yet flexible SLAM software so that developers can quickly and easily add accurate and robust location and mapping capabilities to their robot and consumer-electronics designs. Wheel odometry brings another key sensor into our flexible visual-inertial kinematics system for SLAM. Tightly integrated with the visual and inertial sensors, it improves the performance of the overall system, making the whole greater than the sum of the parts.
Support for wheel odometry will be made available in an upcoming release of SLAMcore’s software in Q1 2021.
Read more at blog.slamcore.com