General Information

The Reactive Synthesis Competition (SYNTCOMP) is a competition for reactive synthesis tools. Its goals are to collect benchmarks in a publicly available library and to foster research on new tools for the automatic synthesis of systems. SYNTCOMP has been organized annually since 2014 as a satellite event of CAV.

Stay informed by subscribing to the SYNTCOMP mailing list.

SYNTCOMP 2023

The 2023 edition of SYNTCOMP was again run on the StarExec platform. The results of the competition are available here. Starting this year, we also have LTLf tracks, that is, LTL (in TLSF format) interpreted over finite words.

SYNTCOMP 2022

The 2022 edition of SYNTCOMP was again run on the StarExec platform. We present the results of the competition (in a new, extended format!) here.

SYNTCOMP 2021

The 2021 edition was again run on the StarExec platform. We were very excited by the number of new participating tools! The results of the competition and the collected tools and benchmarks can be found here.

SYNTCOMP 2020

SYNTCOMP 2020 was run on the StarExec platform with the same rules as SYNTCOMP 2019 and with a new parity-game track! The results of the competition and the collected tools and benchmarks can be found here.

SYNTCOMP 2019

From SYNTCOMP 2019 onward, the competition has been run on the StarExec platform. The results of the competition and the collected tools and benchmarks can be found here.

SYNTCOMP 2018

SYNTCOMP 2018 was run with the same rules as SYNTCOMP 2017. Five synthesis tools participated in the safety (AIGER) track (plus two that ran hors concours), and five in the LTL (TLSF) track. Results and further information can be found here.

SYNTCOMP 2017

SYNTCOMP 2017 featured the same competitive categories as SYNTCOMP 2016, and introduced a quality ranking that improves on those used in previous years. Five synthesis tools participated in the safety (AIGER) track, and five in the LTL (TLSF) track. Results and further information can be found here.

SYNTCOMP 2016

SYNTCOMP 2016 extended the competition to specifications in full LTL. Six synthesis tools competed in the existing track based on AIGER/safety specifications, and three tools competed in the new track based on TLSF/LTL specifications. Results and further information can be found here.

SYNTCOMP 2015

For SYNTCOMP 2015, we added more than 2000 problem instances to our benchmark library and improved the ranking system and evaluation process. Four synthesis tools competed on the new set of benchmarks, showing significant progress over the previous year's implementations. Results and further information can be found here.

SYNTCOMP 2014

In the inaugural SYNTCOMP 2014, five synthesis tools competed on a set of 569 problem instances, collected in the competition's new benchmark format. Results and further information can be found here.
