NODAR provides access to curated datasets optimized for developing and testing 3D vision applications. These datasets include stereo image pairs, calibration files, and ground truth depth maps, enabling precise evaluation and training. Ideal for developers and researchers, they streamline experimentation and accelerate the creation of innovative perception solutions.
All datasets are compatible with our NODAR Viewer software, enabling seamless replay and efficient evaluation.
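Because each sample ships with a ground-truth depth map, a typical evaluation reduces to comparing a predicted depth map against the reference on valid pixels. The sketch below is illustrative only: it uses synthetic arrays in place of real dataset files, and the metric names and validity threshold are assumptions, not part of the NODAR dataset specification.

```python
import numpy as np

def depth_error_metrics(pred, gt, valid_min=0.1):
    """Compare a predicted depth map against a ground-truth depth map.

    Only pixels with a valid ground-truth value (> valid_min meters) are
    scored, since ground-truth depth maps are often sparse.
    """
    mask = gt > valid_min
    diff = pred[mask] - gt[mask]
    abs_rel = np.mean(np.abs(diff) / gt[mask])   # mean absolute relative error
    rmse = np.sqrt(np.mean(diff ** 2))           # root mean squared error, meters
    return {"abs_rel": float(abs_rel), "rmse": float(rmse)}

# Synthetic stand-in for one dataset sample; real samples would be loaded
# from the stereo pair, calibration file, and ground-truth depth map.
gt = np.full((4, 4), 20.0)   # flat surface 20 m away
pred = gt + 0.5              # prediction biased by 0.5 m
metrics = depth_error_metrics(pred, gt)
print(metrics)               # rmse 0.5 m, abs_rel 0.025
```

The same function applies unchanged whether the prediction comes from a learned network or a classical stereo matcher, which is what makes a shared ground-truth format useful for benchmarking.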
Cameras are mounted in the upper left and right corners of the vehicle's windshield while driving on a highway in Germany. This data demonstrates that accurate point clouds can be reconstructed out to 200+ meters on "textureless" asphalt roads, even with the motion blur introduced at highway speeds.
Cameras are mounted on a bar on top of the ego vehicle, which faces a target vehicle approaching from far away. The point cloud and bird's-eye view (BEV) demonstrate that a vehicle as far as 0.5 km away still yields dense depth data (~600 pixels), which is key to downstream perception tasks such as object detection and tracking.
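Why long-range density is hard follows from standard pinhole stereo geometry: depth is Z = f·B/d, so disparity shrinks inversely with range and small matching errors translate into large depth errors far away. The sketch below uses illustrative focal length and baseline values, not NODAR's actual rig parameters.

```python
# Standard stereo relation: Z = f * B / d, with f the focal length in
# pixels, B the baseline in meters, and d the disparity in pixels.
# The values below are illustrative assumptions, not a real camera spec.
focal_px = 4000.0    # focal length, pixels
baseline_m = 1.0     # camera separation, meters

def depth_from_disparity(d_px):
    return focal_px * baseline_m / d_px

# At 500 m the disparity is only f*B/Z = 8 px, so a 0.25 px matching
# error shifts the depth estimate by roughly Z^2/(f*B) * 0.25 ≈ 15.6 m.
d = focal_px * baseline_m / 500.0
print(depth_from_disparity(d))          # 500.0 m
print(depth_from_disparity(d - 0.25))   # ~516 m: long-range sensitivity
```

This quadratic growth of depth error with range is why sub-pixel matching accuracy and wide baselines matter for perceiving distant vehicles.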
Data is recorded at an industrial facility, with a wide variety of objects in the field of view, including heavy machinery, vehicles, workers, storage containers, buildings, and pipes. The scene resembles a cluttered environment where network-based classification approaches tend to degrade due to the presence of out-of-class objects, while Hammerhead continues to provide robust 3D perception of the surroundings.
The data is recorded at night, with only a few street lights illuminating the scene, demonstrating a challenging condition due to significantly lower illuminance (lux). This requires the cameras to capture images at longer exposures while balancing the motion blur those exposures introduce. The depth maps reveal that our stereo-matching algorithm can identify features even on seemingly dark surfaces such as the road, tree trunks, and curbs.