Accurate and robust extrinsic calibration is necessary for deploying autonomous systems that rely on multiple sensors for perception. In this letter, we present a robust system for real-time extrinsic calibration of multiple lidars in the vehicle base frame without the need for any fiducial markers or features. We base our approach on matching absolute GNSS (Global Navigation Satellite System) poses and estimated lidar poses in real time. Comparing rotation components makes the solution more robust than the traditional least-squares approach, which compares translation components only. Additionally, instead of comparing all corresponding poses, we select the poses carrying the maximum mutual information based on our novel observability criteria, which allows us to identify a subset of poses helpful for real-time calibration. We also provide stopping criteria to ensure calibration completion. To validate our approach, extensive tests were carried out on data collected with Scania test vehicles (7 sequences totaling approximately 6.5 km). The results presented in this letter show that our approach accurately determines the extrinsic calibration for various combinations of sensor setups.
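For intuition only, the sketch below illustrates the general pose-matching (hand-eye-style) idea the abstract alludes to: given matched relative motions of the vehicle base (from GNSS/INS) and of a lidar (from lidar odometry), recover the lidar-to-base extrinsic by solving A_i X = X B_i, with the rotation obtained from a Park-and-Martin-style SVD alignment of rotation vectors and the translation from a stacked least-squares problem. This is not the authors' implementation; in particular, the mutual-information-based pose selection, observability criteria, and stopping criteria described in the letter are not shown, and all function names are hypothetical.

```python
# Minimal sketch, assuming A_list/B_list hold matched 4x4 relative motions of the
# vehicle base (GNSS/INS) and the lidar (lidar odometry) over the same intervals.
import numpy as np
from scipy.spatial.transform import Rotation as Rot

def solve_extrinsic(A_list, B_list):
    """Solve A_i @ X = X @ B_i for X, the 4x4 lidar-to-base transform (illustrative)."""
    # Rotation part: R_A R_X = R_X R_B implies alpha_i = R_X beta_i for the
    # corresponding rotation vectors; align them with an orthogonal Procrustes/SVD step.
    alphas = np.stack([Rot.from_matrix(A[:3, :3]).as_rotvec() for A in A_list])
    betas = np.stack([Rot.from_matrix(B[:3, :3]).as_rotvec() for B in B_list])
    H = betas.T @ alphas                      # 3x3 correlation of rotation-vector pairs
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])  # keep det(R_X) = +1
    R_x = (U @ D @ Vt).T
    # Translation part: (R_Ai - I) t_X = R_X t_Bi - t_Ai, stacked over all pairs.
    C = np.vstack([A[:3, :3] - np.eye(3) for A in A_list])
    d = np.hstack([R_x @ B[:3, 3] - A[:3, 3] for A, B in zip(A_list, B_list)])
    t_x, *_ = np.linalg.lstsq(C, d, rcond=None)
    X = np.eye(4)
    X[:3, :3], X[:3, 3] = R_x, t_x
    return X
```

In such a formulation, sufficiently exciting rotational motion in the pose pairs is what makes the extrinsic observable, which is one way to read the abstract's emphasis on comparing rotation components and on selecting informative poses.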