3DAI® Engine


3DAI® Engine Components


Mapping

Autonomous mobile platforms need to make sense of their surroundings. Univrses is developing versatile mapping solutions that can work with several moving agents in the same local space. Our software analyses data from a camera in real time to create a high-resolution map of the space around the platform. The map captures the 3D structure of the space and is extended as the camera view changes to cover more of the platform’s surroundings. The mapping components within 3DAI® Engine can be adjusted to work with a variety of sensors and can output several data formats. They can be used to map very small spaces (such as a room in a factory), larger spaces (such as a warehouse) and even outdoor environments. The technology in Univrses’ 3DAI® mapping products has already been deployed in a range of environments and applications, including industrial robotics.
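The core idea — a map that grows as the camera sees more of the space — can be sketched in a few lines. This is a deliberately minimal illustration, not the 3DAI® implementation; the `CameraPose` and `PointMap` names and the 2D geometry are assumptions made for clarity.

```python
import math

class CameraPose:
    """Illustrative 2D pose of the platform: position (x, y) and heading in radians."""
    def __init__(self, x, y, heading):
        self.x, self.y, self.heading = x, y, heading

    def to_world(self, local_pt):
        """Transform a point from the camera frame to the world frame."""
        lx, ly = local_pt
        c, s = math.cos(self.heading), math.sin(self.heading)
        return (self.x + c * lx - s * ly, self.y + s * lx + c * ly)

class PointMap:
    """Accumulates observed points; coverage extends with each new view."""
    def __init__(self):
        self.points = set()

    def integrate(self, pose, observations):
        for pt in observations:
            wx, wy = pose.to_world(pt)
            # Quantise to 0.1 m cells so repeated observations of the
            # same structure merge into one map entry.
            self.points.add((round(wx, 1), round(wy, 1)))

world_map = PointMap()
# First view: the camera at the origin sees two points straight ahead.
world_map.integrate(CameraPose(0, 0, 0), [(1.0, 0.0), (2.0, 0.0)])
# The platform moves and turns; the new view extends the map.
world_map.integrate(CameraPose(1, 0, math.pi / 2), [(1.0, 0.0)])
print(len(world_map.points))  # 3 distinct map cells
```

The same pattern scales from a single factory room to a warehouse: only the number of integrated views changes, not the update logic.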


Semantics

Univrses’ 3DAI® Engine includes components that “translate” the data in a 3D map into information that can be readily interpreted by a human, which enhances the utility of the map. These semantic interpretation solutions can work with either 2D or 3D data. Univrses is actively using this technology to build semantic maps of the environment for autonomous driving applications: the 3DAI® Engine makes it possible for a vehicle to identify key landmarks within the 3D map being used for navigation, such as road signs, traffic lights and lane markings. 3DAI® Semantics can also be deployed on a mobile robot to help it understand its surroundings; for example, to find the location of a loading pallet.
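A semantic map is, at its simplest, a 3D map whose points carry human-readable class labels that can then be queried. The sketch below assumes a toy `SemanticMap` type and hand-picked landmark classes purely for illustration; it is not a Univrses API.

```python
class SemanticMap:
    """Toy semantic map: 3D landmark positions paired with class labels."""
    def __init__(self):
        self.landmarks = []  # list of (position, label) pairs

    def add(self, position, label):
        self.landmarks.append((position, label))

    def find(self, label):
        """Return the positions of all landmarks with the given class."""
        return [pos for pos, lbl in self.landmarks if lbl == label]

smap = SemanticMap()
# Labels of the kind a semantic pipeline might attach to map points.
smap.add((12.0, 3.5, 1.8), "traffic_light")
smap.add((15.2, -1.0, 0.9), "road_sign")
smap.add((20.1, 3.6, 1.8), "traffic_light")

print(smap.find("traffic_light"))  # two landmark positions
```

The same query pattern serves a mobile robot looking for a loading pallet: the class label changes, the map structure does not.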


Positioning

Adding a camera to a mobile platform (such as a car or a robot) enables the algorithms in Univrses’ 3DAI® Engine to track how that platform is moving. Our software analyses data from the camera in real time to show how the position and pose (orientation) of the platform change relative to its surroundings. The positioning components within 3DAI® Engine are accurate, robust and require only a low processing overhead. The technology in Univrses’ 3DAI® positioning products has already been deployed in real-world applications, including autonomous driving and domestic robotics.
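Camera-based positioning boils down to chaining per-frame relative motion estimates into an absolute pose. The minimal 2D sketch below assumes the vision pipeline has already produced those per-frame increments; the `compose` function is illustrative, not the 3DAI® algorithm.

```python
import math

def compose(pose, delta):
    """Apply a relative motion (dx, dy, dtheta), expressed in the
    platform's current frame, to an absolute pose (x, y, theta)."""
    x, y, th = pose
    dx, dy, dth = delta
    c, s = math.cos(th), math.sin(th)
    return (x + c * dx - s * dy, y + s * dx + c * dy, th + dth)

pose = (0.0, 0.0, 0.0)
# Per-frame estimates: drive 1 m forward while turning 90 degrees,
# then drive 1 m forward again.
for delta in [(1.0, 0.0, math.pi / 2), (1.0, 0.0, 0.0)]:
    pose = compose(pose, delta)

print(pose)  # ends near (1, 1), facing 90 degrees
```

In practice each increment carries some error, which is why positioning is complemented by localisation against a map.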


Localisation

The addition of a camera and Univrses’ 3DAI® Engine to a mobile platform enables localisation of the platform relative to a map. Localising the platform within a map complements the positioning algorithms; it is another crucial building block for enabling robust, higher levels of autonomy in robotic systems. The localisation components within Univrses’ 3DAI® Engine have already been deployed in real-world applications, including consumer products.
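The essence of localisation is matching what the camera currently sees against landmarks already in the map. The sketch below makes simplifying assumptions for clarity: a known heading (e.g. from positioning), 2D geometry, and pre-matched landmark identities; the landmark names and `localise` function are hypothetical.

```python
# Known landmark positions in the map frame (illustrative values).
MAP_LANDMARKS = {"dock": (5.0, 2.0), "pillar": (8.0, -1.0)}

def localise(observations):
    """observations: {landmark_id: (x, y) offset in the platform frame}.
    Each matched landmark votes for a platform position; the votes are
    averaged to suppress individual measurement noise."""
    votes = []
    for name, (ox, oy) in observations.items():
        mx, my = MAP_LANDMARKS[name]
        votes.append((mx - ox, my - oy))
    n = len(votes)
    return (sum(v[0] for v in votes) / n, sum(v[1] for v in votes) / n)

# The camera sees the dock 2 m ahead, and the pillar 5 m ahead and 3 m
# to the right.
position = localise({"dock": (2.0, 0.0), "pillar": (5.0, -3.0)})
print(position)  # platform is at (3.0, 2.0) in the map frame
```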


Sensor Fusion

Sensor fusion is the process of combining sensory data from multiple sources into a single, optimised estimate. Univrses’ 3DAI® Engine includes state-of-the-art sensor fusion components that can be deployed with a range of sensor types. Combining sensor data in this way is integral to robust localisation and a major part of generating accurate 3D maps.
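The core of the idea can be shown with the simplest possible case: fusing two noisy estimates of one quantity, weighted by confidence (inverse variance), as in a Kalman filter update step. The sensors and numbers below are illustrative assumptions, not product behaviour.

```python
def fuse(mean_a, var_a, mean_b, var_b):
    """Minimum-variance fusion of two Gaussian estimates of one quantity."""
    k = var_a / (var_a + var_b)   # gain: trust b more when a is uncertain
    mean = mean_a + k * (mean_b - mean_a)
    var = (1 - k) * var_a         # the fused estimate is more certain
    return mean, var

# A camera estimates a distance of 10.0 m (variance 4.0); an odometry
# prediction says 12.0 m (variance 1.0).
mean, var = fuse(10.0, 4.0, 12.0, 1.0)
print(mean, var)  # mean 11.6, variance ~0.8: closer to the more
                  # certain sensor, and more certain than either alone
```

Real fusion components handle many sensors, multi-dimensional states and time propagation, but each update follows this same confidence-weighted pattern.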


3D Reconstruction

A stationary camera will capture images of moving objects from different perspectives. Univrses’ 3DAI® Engine can determine the 3D structure of those objects by comparing these perspectives. This is useful when, for example, examining objects for faults or defects. The 3D reconstruction components within 3DAI® Engine are being developed to work under different lighting conditions and with both homogeneous and heterogeneous structures. They have already been deployed in a warehousing application and will soon be deployed for specific tasks within mining.
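Comparing two perspectives recovers depth by triangulation. The sketch below uses the simplest rectified two-view geometry — depth from the pixel disparity of a matched feature — as an illustrative assumption; the function name and the camera parameters are hypothetical.

```python
def triangulate_depth(focal_px, baseline_m, x_first_px, x_second_px):
    """Depth of a point seen at pixel columns x_first/x_second in two
    horizontally separated views (rectified two-view geometry)."""
    disparity = x_first_px - x_second_px
    if disparity <= 0:
        raise ValueError("the point must shift between the two views")
    return focal_px * baseline_m / disparity

# An object on a conveyor moves 0.2 m between frames of a stationary
# camera, which is geometrically equivalent to a 0.2 m stereo baseline.
# A surface feature is seen at column 400, then at column 360.
depth = triangulate_depth(focal_px=800.0, baseline_m=0.2,
                          x_first_px=400.0, x_second_px=360.0)
print(depth)  # 4.0 m
```

Repeating this over many matched features, and many views, yields the dense 3D structure needed to inspect an object for faults or defects.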