If you are looking to jumpstart projects that integrate Spatial Intelligence into autonomous mobile robots or to retrofit visual spatial awareness capabilities to existing manually operated fleets, then register to access the Slamcore SDK.

The market for mobile robots in warehouses is booming, yet in most cases these autonomous mobile robots (AMRs) must still work alongside manually driven vehicles, automated guided vehicles (AGVs) following set paths, and pedestrians. Many operators are realizing that visual spatial intelligence is the key to managing all of this safely and efficiently.

LIDAR, UWB beacons, fiducials, and other technologies already provide highly capable localization and mapping functions. But adding vision to the automation stack can increase the efficiency, safety, and speed of operations, especially where AMRs, manually driven vehicles, and pedestrians share common areas. With their wide field of view, stereo cameras capture richer information, creating more detailed, accurate, and robust 3D maps and supporting the semantic labeling of objects for enhanced perception and real-time management.
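To make the stereo point concrete, the sketch below shows how a rectified stereo pair converts pixel disparity into metric depth, the raw ingredient of any vision-built 3D map. The focal length and baseline figures are illustrative assumptions, not the specification of any particular camera.

```python
# Minimal sketch of how a calibrated, rectified stereo pair yields
# metric depth. focal_px and baseline_m are assumed example values,
# not the parameters of any specific stereo camera.

def depth_from_disparity(disparity_px: float,
                         focal_px: float = 700.0,
                         baseline_m: float = 0.12) -> float:
    """Depth Z = f * B / d for a rectified stereo pair."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a valid match")
    return focal_px * baseline_m / disparity_px

# A feature matched with 20 px of disparity sits about 4.2 m away.
print(f"{depth_from_disparity(20.0):.2f} m")
```

Running this per matched pixel is, in essence, how a stereo camera turns a pair of 2D images into the 3D structure that mapping and localization build on.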

Yet finding the right combination of hardware, sensors, and algorithms to deliver real-time, accurate, and robust mapping and localization can be complex, time-consuming, and costly. The Slamcore SDK can accelerate the integration, testing, and deployment of visual Spatial Intelligence as part of a navigation or wider autonomy stack. Slamcore's algorithms can also add semantic labels, so that autonomous machines and vehicles perceive the differences between objects and modify their routes and operations accordingly.
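As an illustration of what semantic labels make possible, the sketch below shows how a navigation layer might cap an AMR's speed according to the class of a nearby detection. The class names, distances, and thresholds are hypothetical examples for illustration only, not the Slamcore SDK's API or output format.

```python
# Conceptual sketch only: class and label names here are illustrative
# assumptions, not the Slamcore SDK API. It shows how semantic labels
# attached to mapped objects could drive navigation behaviour.

from dataclasses import dataclass

@dataclass
class DetectedObject:
    label: str         # semantic class, e.g. "pedestrian" or "pallet"
    distance_m: float  # distance from the robot along its planned path

def speed_limit_mps(objects: list[DetectedObject],
                    cruise_mps: float = 1.5) -> float:
    """Pick a speed cap from the nearest semantically labelled obstacle.

    A static pallet only requires rerouting, while a pedestrian or a
    manually driven vehicle warrants a wider berth and a lower speed.
    """
    limit = cruise_mps
    for obj in objects:
        if obj.label == "pedestrian" and obj.distance_m < 5.0:
            limit = min(limit, 0.5)  # slow to creep speed near people
        elif obj.label == "forklift" and obj.distance_m < 8.0:
            limit = min(limit, 0.8)  # manually driven vehicles move unpredictably
        # static inventory such as pallets triggers rerouting, not slowing
    return limit

if __name__ == "__main__":
    scene = [DetectedObject("pallet", 3.0), DetectedObject("pedestrian", 4.2)]
    print(f"commanded speed: {speed_limit_mps(scene):.1f} m/s")  # -> 0.5 m/s
```

The point of the sketch is that geometry alone treats a pallet and a person identically; semantic labels let the same obstacle map produce different, safer behaviours.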

Registration for the Slamcore SDK is open; to access and download the kit, visit www.slamcore.com/sdk.

Ends
