SYNTCOMP 2025 Results

The results of the 2025 edition of SYNTCOMP are in. First, we give a list of the winners for each of the tracks run this year: parity games, LTL (realizability & synthesis), and finite-word LTL (LTLf). Then, we give more details about running times and scores as well as some graphs. Finally, we link to the participants of the competition.

The data used comes from the TGCC jobs. All data, scripts, benchmarks, and (binaries of the) tools are archived and will be made available soon. Congratulations to the winners and thanks to all the participants!

The winners

Parity game realizability

  • Realizability: Knor

LTL realizability

  • Realizability: ltlsynt

LTL synthesis

  • Synthesis time: ltlsynt
  • Synthesis quality: semml

LTLf realizability

  • Realizability: ltlfsynt

Parity Game Realizability

Solver | Solved | Wallclock time | CPU time | Memory
knor_npp | 429 | 6126 | 6063 | 63642
knor_tl | 429 | 505 | 466 | 65872
knor_tlq (DISQUALIFIED) | 429 | 6362 | 6298 | 67459
ltlsynt | 359 | 55625 | 55375 | 12733

LTL Realizability

Solver | Solved | Wallclock time | CPU time | Memory
sdf | 897 | 54965.88 | 77963.81 | 30538
semml_real | 1318 | 130507.16 | 169790.13 | 4196118
ltlsynt-real-acd | 1328 | 71128.38 | 70476.14 | 8551247
ltlsynt-real-lar | 1330 | 96028.58 | 95261.66 | 731324
ltlsynt-real-ds | 1225 | 148534 | 147510.03 | 2796639
acacia (DISQUALIFIED) | 1156 | 143432.74 | 405899.89 | 5924200

LTL Synthesis

Solver | Solved | Wallclock time | CPU time | Memory | Mean quality score
ltlsynt-acd | 1297 | 99166.52 | 98194.13 | 5570252 | 1.46065726224849
ltlsynt-lar | 1296 | 122680.21 | 121552.64 | 3659468 | 1.44114796838032
ltlsynt-ds | 1187 | 151797.75 | 150610.88 | 6667925 | 1.58878496479921
semml_small | 1293 | 153231.04 | 204678.48 | 4410877 | 1.91733469356956
semml_fast | 1295 | 162855.38 | 217312.52 | 4252316 | 1.43764597172091
sdf | 877 | 68625.23 | 94958.96 | 5140973 | 1.33605459519285
sdf-abc | 873 | 62592.85 | 90679.44 | 5147208 | 1.58398914443439

LTLf Realizability

Solver | Solved | Wallclock time | CPU time | Memory
ltlfsynt-bfs-os | 402 | 21777.78 | 21630.707 | 121595
ltlfsynt-bfs | 392 | 21677.56 | 21528.201 | 124140
ltlfsynt-dfs-os | 401 | 18443.58 | 18312.239 | 105114
Cosy-m1 (DISQUALIFIED) | 357 | 20287.27 | 20166.786 | 264615
Cosy-m2 (DISQUALIFIED) | 356 | 11322.28 | 11181.704 | 156241
LydiaSynt | 308 | 23019.55 | 22842.952 | 83175

Benchmarks and Solvers

The set of benchmarks used in 2025 can be fetched from the usual repository.

Below, you will find information about the participating solvers: their source code, license, and links to the most recent reports or preferred publications to cite if you use the tools in your academic work.

Solver | Source repo | License | Report(s) to cite
ltl(f)synt | GitLab repo | GPL-3.0+ | Paper on ltlsynt, CIAA paper for ltlfsynt
Acacia bonsai | GitHub repo | GPL-3.0 | Paper
Knor | GitHub repo | GPL-3.0 | Paper
sdf | GitHub repo | MIT | arXiv report
LydiaSyft | GitHub repo | AGPL v3.0 | Paper
SemML | GitLab repo | GPL v3.0 | Paper
Cosy | TBA | TBA | Contact author