Call for SLAM Challenge


This SLAM-for-AR competition is sponsored exclusively by SenseTime Group Limited.


SLAM for AR Competition @ ISMAR 2019


As one of the most important techniques for AR applications, SLAM has reached a level of maturity and entered the stage of productization. At this stage, more and more effort is being devoted to improving the overall performance of a SLAM system rather than individual technical indicators. In addition, compared with other applications such as robotics, AR products impose stricter requirements on handling a variety of challenging situations, since a home user may not move the AR device carefully and the real environment may be quite complex.


In this context, this year we launch a SLAM competition specifically designed for AR applications, emphasizing the overall performance of SLAM systems. We provide a visual-inertial dataset that mimics the usage scenarios of AR. The visual-inertial data are captured by a mobile phone (a Xiaomi Mi 8), and the ground-truth data are obtained by a VICON motion capture system. All sensor data are calibrated and synchronized. Based on this dataset, we evaluate the overall performance of a SLAM system, considering tracking accuracy, initialization quality, tracking robustness, relocalization time, and computational efficiency.


If you are interested in this SLAM for AR competition, please fill out the registration form and send it to us. If you have any requests or suggestions, please feel free to contact us.


Visual SLAM and visual-inertial SLAM will be evaluated separately. In each category, there will be three winners with the following prizes:

1st winner: 3,000 US dollars

2nd winner: 1,500 US dollars

3rd winner: 750 US dollars

The finalist teams will be invited to present their work at the SLAM for AR Competition workshop. We also encourage the invited teams to give a live demo on site.

Important Dates

Training dataset (ground-truth included) release – August 6th, 2019

Download the training dataset; the data format is the same as that of the competition dataset.

Develop and improve your algorithm with the training dataset.

Evaluation tool:

First Round – August 6th–September 20th, 2019 (the deadline has been extended!)

Register for the competition, and you will receive the download link of the competition dataset.

Run your system on the competition dataset and submit the result by email before the deadline.

Write a description of your system (at least two pages) and submit it before the deadline, including:

An explanation of your system.

An explanation of its novelties.

Any format is acceptable, but we recommend the IEEE VGTC templates (TeX, DOC):

VGTC LaTeX Template

VGTC Word Template

Cite your related published papers (arXiv papers are acceptable), if any. If you already have a paper that exactly describes your system, you may submit that paper directly.

Teams with the highest scores will be notified and invited to attend the final round and present their work at the workshop.

Final Round – Sep.–Oct., 2019

More competition data will be used in the final round. The running time will also be taken into account.

Each team shall provide an executable version of their system, which will be run on our benchmarking PC in a controlled environment (Linux or Windows).

Workshop – Oct. 14th, 2019

Each team will give a presentation about the techniques used in their system.

Final ranks and winners will be announced in the Closing Session on Oct. 17th, 2019.



We require that your system run at an average of 30 FPS or above on a typical desktop PC (e.g., an Intel i7 CPU with a single NVIDIA graphics card and no more than 16 GB of memory).

If your system runs at less than 30 FPS on our benchmarking PC, its final score will be lowered.
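Before submitting, you may want to roughly verify that your pipeline meets this speed requirement on your own hardware. Below is a minimal, hypothetical self-check sketch; the helper name and per-frame callback are our own illustration, and only the organizers' benchmark on their PC is authoritative:

```python
import time

def average_fps(process_frame, frames):
    """Measure the average throughput of a per-frame SLAM callback.

    process_frame: a callable that processes one frame.
    frames: an iterable of frames (images, or image/IMU bundles).
    Returns the average frames processed per second.
    """
    start = time.perf_counter()
    count = 0
    for frame in frames:
        process_frame(frame)
        count += 1
    elapsed = time.perf_counter() - start
    return count / elapsed if elapsed > 0 else float("inf")
```

A run over the full training sequence gives a more honest average than a short clip, since relocalization and loop-closure events can cause frame-time spikes.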

Submission Format & Evaluation Criteria

To evaluate performance, the estimated 6 DoF camera poses (from the camera coordinate system to the world coordinate system) and the running time are required.

We evaluate the overall performance of a SLAM system considering tracking accuracy, initialization quality, tracking robustness, relocalization time, and computational efficiency. The running time will be taken into account in the final-round competition.

Please refer to the evaluation instruction file for the details.
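As a rough self-check before submitting, tracking accuracy on benchmarks of this kind is commonly summarized as the absolute trajectory error (ATE) after rigidly aligning the estimated trajectory to the ground truth. The sketch below is an illustration only, not the official evaluation tool; it assumes both trajectories are time-synchronized N×3 arrays of camera positions:

```python
import numpy as np

def ate_rmse(est, gt):
    """Absolute trajectory error (RMSE) after closed-form rigid alignment.

    est, gt: (N, 3) arrays of estimated / ground-truth camera positions,
    assumed already associated in time (the real tool handles association).
    """
    est = np.asarray(est, dtype=float)
    gt = np.asarray(gt, dtype=float)
    # Center both trajectories.
    mu_e, mu_g = est.mean(axis=0), gt.mean(axis=0)
    E, G = est - mu_e, gt - mu_g
    # Horn/Kabsch closed-form rotation from the SVD of the cross-covariance.
    U, _, Vt = np.linalg.svd(E.T @ G)
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])  # avoid reflections
    R = Vt.T @ S @ U.T          # rotation mapping est into the gt frame
    t = mu_g - R @ mu_e
    aligned = est @ R.T + t
    return float(np.sqrt(np.mean(np.sum((aligned - gt) ** 2, axis=1))))
```

The rigid alignment removes the arbitrary choice of world frame before measuring per-pose error; the official evaluation instruction file defines the actual metrics and submission file formats.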



Competition Chairs

Guofeng Zhang

Zhejiang University, China

Jing Chen

Beijing Institute of Technology, China

Guoquan Huang

University of Delaware, USA


Thanks to our sponsors

We thank our sponsors for supporting the ISMAR 2019 conference.


Support Units