

Robust monocular depth estimation is essential for applications that rely on visual cues to understand the real world. To ensure robustness, unsupervised domain adaptation is widely used for monocular depth estimation. Despite recent advances, existing methods often struggle in outdoor environments because of adverse weather conditions and limited datasets. Intentionally corrupted images, derived from real images captured in clear weather for unsupervised domain adaptation, often fail to capture the complex characteristics of diverse environments, yielding unrealistic training data. From this viewpoint, simulation data offer a more plausible representation of adverse weather conditions; however, they present their own drawbacks, as they can degrade adaptation capability. To address the limitations of using simulation data, we propose a wild-condition pass filtering module that extracts wild-condition features and captures cross-domain relationships from both real and simulation datasets. This enables comprehensive learning of the distinct conditions in each dataset and improves performance on real adverse-condition target images. The proposed method achieves a notable 22% improvement over the baseline on the Foggy Cityscapes dataset, highlighting the importance of realistic domain adaptation techniques for effectively addressing the challenges posed by adverse environmental conditions. The code is available at https://github.com/JH2-LEE/wide.