This paper investigates collaborative localization for heterogeneous robots, resulting in a scheme for the relative localization of a heterogeneous team of low-cost mobile robots. A novel complementary Kalman filter (CKF) approach addresses collaborative localization and mapping by optimally estimating the error states of the team. This indirect filter optimally combines the inertial and visual proprioceptive measurements of each vehicle with stereoscopic measurements made by unmanned ground vehicles (UGVs). The proposed methodology is demonstrated with a heterogeneous robot team in both simulation and experiments. Six behavioral strategies, each specifying the role and behavior of every robot, are simulated and evaluated for both the CKF and a simultaneous localization and mapping (SLAM) approach on maps containing randomly placed obstacles. Results show that the principal sources of error (image quantization, asynchronous sensors, and a nonstationary bias) were sufficiently modeled to estimate the pose of the aerial robot. Localization accuracies of 2–4 cm are achieved, on average, while the robots operate at distances of 1–4 m from one another. The best-performing behavior for the CKF approach maintained an average positional error of 2.2 cm, a relative error of 0.30% of the distance traveled by the entire team at the conclusion of the maneuvers. For all multi-UGV strategies, the CKF approach outperformed the best SLAM results, which had a mean error of 6.7 cm (0.48% of distance traveled).
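To make the indirect (error-state) filtering idea concrete, the sketch below shows a minimal complementary Kalman filter cycle in 2D: the nominal state is propagated by dead reckoning from proprioceptive measurements, while the filter estimates only the error state, which a relative (e.g., stereoscopic) measurement corrects before being folded back into the nominal state. All matrices, noise values, and the scenario itself are hypothetical illustrations, not the paper's actual models.

```python
import numpy as np

def propagate(P, F, Q):
    """Propagate the error-state covariance through the motion model."""
    return F @ P @ F.T + Q

def update(x_nom, P, z, h, H, R):
    """Fuse one relative measurement z against its prediction h
    from the nominal state; return corrected state and covariance."""
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    dx = K @ (z - h)                       # estimated error state
    x_nom = x_nom + dx                     # fold error back into nominal state
    P = (np.eye(len(x_nom)) - K @ H) @ P   # updated covariance (error reset)
    return x_nom, P

# Toy scenario: a UGV observes the aerial robot's relative position.
x_nom = np.array([1.0, 2.0])               # nominal pose from dead reckoning
P = np.eye(2) * 0.25                       # error covariance
F, Q = np.eye(2), np.eye(2) * 0.01         # trivial error dynamics and noise
P = propagate(P, F, Q)

H, R = np.eye(2), np.eye(2) * 0.05         # direct position measurement model
z = np.array([1.1, 1.9])                   # stereoscopic relative measurement
x_nom, P = update(x_nom, P, z, x_nom, H, R)
```

Because only the (small, slowly varying) error state is filtered, the nominal state can be driven by fast proprioceptive integration while the correction step runs at the slower, asynchronous rate of the stereoscopic measurements.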