MINOS: Multimodal Indoor Simulator for Navigation in Complex Environments
arXiv preprint, December 2017
Abstract
We present MINOS, a simulator designed to support the development of
multisensory models for goal-directed navigation in complex indoor
environments. The simulator leverages large datasets of complex 3D environments
and supports flexible configuration of multimodal sensor suites. We use MINOS
to benchmark deep-learning-based navigation methods, to analyze the influence
of environmental complexity on navigation performance, and to carry out a
controlled study of multimodality in sensorimotor learning. The experiments
show that current deep reinforcement learning approaches fail in large
realistic environments. The experiments also indicate that multimodality is
beneficial in learning to navigate cluttered scenes. MINOS is released
open-source to the research community at http://minosworld.org. A video
showcasing MINOS can be found at https://youtu.be/c0mL9K64q84.
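As a rough illustration of what "flexible configuration of multimodal sensor suites" can mean in practice, the sketch below defines a small sensor-suite configuration in Python. The names used here (SensorSuite, EpisodeConfig, make_observation_space) are hypothetical and do not reflect MINOS's actual API; the sketch only illustrates the kind of per-modality configuration a simulator like this exposes. SUNCG and Matterport3D are the environment datasets referenced by the paper.

    from dataclasses import dataclass, field
    from typing import List, Tuple, Dict

    @dataclass
    class SensorSuite:
        """Hypothetical multimodal sensor configuration for an indoor navigation agent."""
        modalities: List[str] = field(default_factory=lambda: ["color", "depth"])
        resolution: Tuple[int, int] = (84, 84)  # per-frame image resolution (height, width)
        use_contact: bool = True                # binary contact (collision) sensor
        use_force: bool = False                 # force-feedback sensor

    @dataclass
    class EpisodeConfig:
        """Hypothetical goal-directed navigation episode settings."""
        dataset: str = "suncg"    # source of 3D environments (e.g., SUNCG or Matterport3D)
        goal_type: str = "point"  # navigate to a point goal vs. a named room or object
        max_steps: int = 500

    def make_observation_space(suite: SensorSuite) -> Dict[str, Tuple[int, ...]]:
        """Describe the shape of each observation channel the agent would receive."""
        h, w = suite.resolution
        space = {m: (h, w, 1 if m == "depth" else 3) for m in suite.modalities}
        if suite.use_contact:
            space["contact"] = (1,)
        return space

    if __name__ == "__main__":
        suite = SensorSuite(modalities=["color", "depth"])
        episode = EpisodeConfig(dataset="suncg", goal_type="point")
        print(episode)
        print(make_observation_space(suite))

In a gym-style training loop, a configuration like this would determine which observation channels (RGB, depth, contact, force) are fed to the learning agent at each step, which is the knob varied in the paper's controlled study of multimodality.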
Citation
Manolis Savva, Angel X. Chang, Alexey Dosovitskiy, Thomas Funkhouser, and Vladlen Koltun.
"MINOS: Multimodal Indoor Simulator for Navigation in Complex Environments."
arXiv:1712.03931, December 2017.
BibTeX
@techreport{Savva:2017:MMI,
  author      = "Manolis Savva and Angel X. Chang and Alexey Dosovitskiy and Thomas Funkhouser and Vladlen Koltun",
  title       = "{MINOS}: Multimodal Indoor Simulator for Navigation in Complex Environments",
  institution = "arXiv preprint",
  year        = "2017",
  month       = dec,
  number      = "1712.03931"
}