Results of the competition
1 Results over all benchmarks
In this section you can find tables containing the summarized global results and the results for the individual categories. A full report of the competition is available for download. After the competition, a few errors were discovered; they are discussed in the affected benchmark sections below.
Multi-system teams were marked in red in the original tables.
1.1 Global results
Result table for ranking "Global" |
Place | Team | Score | Time (s) |
1 | Potassco | 0.88 | 103925 |
2 | Claspfolio | 0.77 | 156113 |
3 | DLV | 0.71 | 201249 |
4 | IDP | 0.63 | 218284 |
5 | Smodels-IE | 0.56 | 268783 |
6 | bpsolver-CLP(FD) | 0.49 | 279437 |
7 | Cmodels | 0.41 | 237661 |
8 | LP2SAT+MINISAT | 0.39 | 242378 |
9 | Sup | 0.38 | 250581 |
10 | LP2DIFF+BCLT | 0.36 | 275653 |
11 | LP2DIFF+YICES | 0.35 | 264078 |
12 | Enfragmo | 0.32 | 312339 |
13 | sabe | 0.23 | 378650 |
14 | pbmodels | 0.21 | 384508 |
15 | ASPeRiX | 0.1 | 459640 |
16 | amsolver | 0.05 | 467903 |
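All result tables in this report share the flattened "Place | Team | Score | Time (s)" row layout shown above. A minimal parsing sketch, assuming that layout and the " | " separator (the `parse_row` helper is illustrative, not part of the competition tooling; Solved columns, where present, would need an extra field):

```python
def parse_row(line: str) -> dict:
    """Parse one 'Place | Team | Score | Time (s)' result row."""
    # Drop the trailing ' |', split on the pipe separator, trim whitespace.
    place, team, score, time_s = [f.strip() for f in line.rstrip(" |").split("|")]
    return {
        "place": int(place),
        "team": team,
        "score": float(score),
        "time_s": int(time_s),
    }

# Sample rows copied from the "Global" ranking table above.
rows = [
    "1 | Potassco | 0.88 | 103925 |",
    "2 | Claspfolio | 0.77 | 156113 |",
]
parsed = [parse_row(r) for r in rows]
```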
1.2 Decision problems
Result table for ranking "All decision problems" |
Place | Team | Score | Solved | Time (s) |
1 | Potassco | 0.95 | 585 / 622 = 94% | 29607 |
2 | Claspfolio | 0.84 | 511 / 622 = 82% | 77780 |
3 | Cmodels | 0.82 | 498 / 622 = 80% | 99721 |
4 | DLV | 0.81 | 497 / 622 = 79% | 108448 |
5 | LP2SAT+MINISAT | 0.79 | 490 / 622 = 78% | 104438 |
6 | Sup | 0.77 | 465 / 622 = 74% | 112641 |
7 | IDP | 0.75 | 450 / 622 = 72% | 117223 |
8 | LP2DIFF+BCLT | 0.72 | 438 / 622 = 70% | 137713 |
9 | LP2DIFF+YICES | 0.7 | 432 / 622 = 69% | 126138 |
10 | bpsolver-CLP(FD) | 0.63 | 365 / 622 = 58% | 165902 |
11 | Smodels-IE | 0.62 | 369 / 622 = 59% | 165607 |
12 | Enfragmo | 0.6 | 348 / 622 = 55% | 190741 |
13 | pbmodels | 0.42 | 243 / 622 = 39% | 248505 |
14 | sabe | 0.39 | 234 / 622 = 37% | 261961 |
15 | ASPeRiX | 0.21 | 98 / 622 = 15% | 321700 |
16 | amsolver | 0.1 | 83 / 622 = 13% | 329963 |
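The percentages in the Solved column appear to be truncated (floored) rather than rounded: for example, 497 / 622 ≈ 79.9% is reported as 79%, and 465 / 622 ≈ 74.8% as 74%. A quick check against a few rows of the table above:

```python
# (solved, total, reported_percent) triples taken from the
# "All decision problems" table.
samples = [
    (585, 622, 94),
    (497, 622, 79),  # 79.9% -> reported as 79, i.e. floored, not rounded
    (465, 622, 74),  # 74.75% -> 74
    (98, 622, 15),
]
for solved, total, reported in samples:
    assert solved * 100 // total == reported  # integer (floor) division
```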
1.2.1 Decision problems in P
Result table for ranking "Decision in P" |
Place | Team | Score | Solved | Time (s) |
1 | Potassco | 1 | 89 / 89 = 100% | 735 |
2 | bpsolver-CLP(FD) | 1 | 89 / 89 = 100% | 1342 |
3 | DLV | 1 | 89 / 89 = 100% | 4861 |
4 | Claspfolio | 0.8 | 60 / 89 = 67% | 17982 |
5 | Smodels-IE | 0.8 | 60 / 89 = 67% | 18021 |
6 | LP2SAT+MINISAT | 0.8 | 60 / 89 = 67% | 18270 |
7 | Sup | 0.8 | 60 / 89 = 67% | 18606 |
8 | LP2DIFF+BCLT | 0.8 | 60 / 89 = 67% | 18713 |
9 | Cmodels | 0.8 | 60 / 89 = 67% | 19072 |
10 | LP2DIFF+YICES | 0.78 | 59 / 89 = 66% | 18864 |
11 | Enfragmo | 0.76 | 57 / 89 = 64% | 24157 |
12 | ASPeRiX | 0.69 | 66 / 89 = 74% | 18051 |
13 | IDP | 0.54 | 41 / 89 = 46% | 29594 |
14 | sabe | 0.41 | 31 / 89 = 34% | 36426 |
15 | pbmodels | 0.38 | 29 / 89 = 32% | 36656 |
16 | amsolver | 0 | 0 / 89 = 0% | 53845 |
1.2.2 Decision problems in NP
Result table for ranking "Decision in NP" |
Place | Team | Score | Solved | Time (s) |
1 | Potassco | 0.97 | 491 / 516 = 95% | 21253 |
2 | Claspfolio | 0.89 | 451 / 516 = 87% | 49513 |
3 | Cmodels | 0.85 | 434 / 516 = 84% | 72283 |
4 | IDP | 0.83 | 409 / 516 = 79% | 77428 |
5 | LP2SAT+MINISAT | 0.82 | 430 / 516 = 83% | 75883 |
6 | Sup | 0.8 | 405 / 516 = 78% | 83749 |
7 | DLV | 0.76 | 391 / 516 = 75% | 100496 |
8 | LP2DIFF+BCLT | 0.73 | 378 / 516 = 73% | 108715 |
9 | LP2DIFF+YICES | 0.72 | 373 / 516 = 72% | 96989 |
10 | Smodels-IE | 0.61 | 309 / 516 = 59% | 137300 |
11 | Enfragmo | 0.59 | 291 / 516 = 56% | 156298 |
12 | bpsolver-CLP(FD) | 0.57 | 274 / 516 = 53% | 155559 |
13 | pbmodels | 0.44 | 214 / 516 = 41% | 201563 |
14 | sabe | 0.4 | 203 / 516 = 39% | 215250 |
15 | amsolver | 0.12 | 83 / 516 = 16% | 265833 |
16 | ASPeRiX | 0.12 | 32 / 516 = 6% | 293363 |
1.3 Optimization problems
Result table for ranking "Optimization" |
Place | Team | Score | Time (s) |
1 | Potassco | 81.12 | 74317 |
2 | Claspfolio | 69.61 | 78332 |
3 | DLV | 61.04 | 92801 |
4 | IDP | 50.88 | 101060 |
5 | Smodels-IE | 49.88 | 103176 |
6 | bpsolver-CLP(FD) | 35.8 | 113534 |
7 | sabe | 6.74 | 116689 |
8 | Enfragmo | 5.07 | 121598 |
9 | pbmodels | 1.19 | 136003 |
10 | Sup | 0 | 137940 |
11 | LP2SAT+MINISAT | 0 | 137940 |
12 | LP2DIFF+YICES | 0 | 137940 |
13 | LP2DIFF+BCLT | 0 | 137940 |
14 | ASPeRiX | 0 | 137940 |
15 | amsolver | 0 | 137940 |
16 | Cmodels | 0 | 137940 |
2 Results on individual benchmarks
2.1 Polynomially solvable problems (P)
2.1.1 Benchmark HydraulicPlanning
There were 15 instances for this benchmark, 1 of which was unsatisfiable. 15 teams submitted a solution for this benchmark. The one team that did not participate was amsolver. 14 of the 15 instances were solved by all of the teams.
Result table for benchmark "HydraulicPlanning" |
Place | Team | Score | Solved | Time (s) |
1 | Sup | 1 | 15 / 15 = 100% | 0 |
2 | Cmodels | 1 | 15 / 15 = 100% | 0 |
3 | ASPeRiX | 1 | 15 / 15 = 100% | 0 |
4 | bpsolver-CLP(FD) | 1 | 15 / 15 = 100% | 0 |
5 | Potassco | 1 | 15 / 15 = 100% | 0 |
6 | DLV | 1 | 15 / 15 = 100% | 0 |
7 | Smodels-IE | 1 | 15 / 15 = 100% | 0 |
8 | LP2SAT+MINISAT | 1 | 15 / 15 = 100% | 0 |
9 | LP2DIFF+BCLT | 1 | 15 / 15 = 100% | 1 |
10 | LP2DIFF+YICES | 1 | 15 / 15 = 100% | 1 |
11 | Enfragmo | 1 | 15 / 15 = 100% | 9 |
12 | Claspfolio | 1 | 15 / 15 = 100% | 19 |
13 | IDP | 1 | 15 / 15 = 100% | 101 |
14 | pbmodels | 1 | 15 / 15 = 100% | 105 |
15 | sabe | 0.93 | 14 / 15 = 93% | 708 |
16 | amsolver | 0 | 0 / 15 = 0% | 9075 |
2.1.2 Benchmark HydraulicLeaking
There were 15 instances for this benchmark, 0 of which were unsatisfiable. 15 teams submitted a solution for this benchmark. The one team that did not participate was amsolver. 12 of the 15 instances were solved by all of the teams.
Result table for benchmark "HydraulicLeaking" |
Place | Team | Score | Solved | Time (s) |
1 | bpsolver-CLP(FD) | 1 | 15 / 15 = 100% | 0 |
2 | ASPeRiX | 1 | 15 / 15 = 100% | 0 |
3 | DLV | 1 | 15 / 15 = 100% | 1 |
4 | Cmodels | 1 | 15 / 15 = 100% | 1 |
5 | Potassco | 1 | 15 / 15 = 100% | 1 |
6 | Sup | 1 | 15 / 15 = 100% | 1 |
7 | Smodels-IE | 1 | 15 / 15 = 100% | 1 |
8 | LP2SAT+MINISAT | 1 | 15 / 15 = 100% | 1 |
9 | LP2DIFF+BCLT | 1 | 15 / 15 = 100% | 1 |
10 | LP2DIFF+YICES | 1 | 15 / 15 = 100% | 1 |
11 | Claspfolio | 1 | 15 / 15 = 100% | 19 |
12 | IDP | 1 | 15 / 15 = 100% | 122 |
13 | Enfragmo | 0.8 | 12 / 15 = 80% | 1854 |
14 | pbmodels | 0.8 | 12 / 15 = 80% | 2099 |
15 | sabe | 0.8 | 12 / 15 = 80% | 2107 |
16 | amsolver | 0 | 0 / 15 = 0% | 9075 |
2.1.3 Benchmark CompanyControls
There were 15 instances for this benchmark, 0 of which were unsatisfiable. 6 teams submitted a solution for this benchmark. The 10 teams that did not participate were: ASPeRiX, Cmodels, Sup, Enfragmo, LP2DIFF+BCLT, LP2SAT+MINISAT, LP2DIFF+YICES, pbmodels, sabe, amsolver.
Result table for benchmark "CompanyControls" |
Place | Team | Score | Solved | Time (s) |
1 | Smodels-IE | 1 | 15 / 15 = 100% | 271 |
2 | Potassco | 1 | 15 / 15 = 100% | 272 |
3 | Claspfolio | 1 | 15 / 15 = 100% | 297 |
4 | LP2SAT+MINISAT | 1 | 15 / 15 = 100% | 309 |
5 | LP2DIFF+BCLT | 1 | 15 / 15 = 100% | 309 |
6 | LP2DIFF+YICES | 1 | 15 / 15 = 100% | 311 |
7 | DLV | 1 | 15 / 15 = 100% | 588 |
8 | Sup | 1 | 15 / 15 = 100% | 1000 |
9 | bpsolver-CLP(FD) | 1 | 15 / 15 = 100% | 1173 |
10 | Cmodels | 1 | 15 / 15 = 100% | 1466 |
11 | Enfragmo | 1 | 15 / 15 = 100% | 4203 |
12 | IDP | 0 | 0 / 15 = 0% | 9000 |
13 | pbmodels | 0 | 0 / 15 = 0% | 9075 |
14 | sabe | 0 | 0 / 15 = 0% | 9075 |
15 | ASPeRiX | 0 | 0 / 15 = 0% | 9075 |
16 | amsolver | 0 | 0 / 15 = 0% | 9075 |
2.1.4 Benchmark GrammarBasedInformationExtraction
There were 29 instances for this benchmark, 15 of which were unsatisfiable. 8 teams submitted a solution for this benchmark. The 8 teams that did not participate were: Smodels-IE, Enfragmo, LP2DIFF+BCLT, LP2SAT+MINISAT, LP2DIFF+YICES, pbmodels, sabe, amsolver.
Result table for benchmark "GrammarbasedInformationExtraction" |
Place | Team | Score | Solved | Time (s) |
1 | bpsolver-CLP(FD) | 1 | 29 / 29 = 100% | 12 |
2 | Potassco | 1 | 29 / 29 = 100% | 304 |
3 | ASPeRiX | 1 | 29 / 29 = 100% | 3332 |
4 | DLV | 1 | 29 / 29 = 100% | 4122 |
5 | IDP | 0 | 0 / 29 = 0% | 17400 |
6 | Sup | 0 | 0 / 29 = 0% | 17400 |
7 | Claspfolio | 0 | 0 / 29 = 0% | 17400 |
8 | Cmodels | 0 | 0 / 29 = 0% | 17400 |
9 | pbmodels | 0 | 0 / 29 = 0% | 17545 |
10 | LP2SAT+MINISAT | 0 | 0 / 29 = 0% | 17545 |
11 | LP2DIFF+YICES | 0 | 0 / 29 = 0% | 17545 |
12 | LP2DIFF+BCLT | 0 | 0 / 29 = 0% | 17545 |
13 | sabe | 0 | 0 / 29 = 0% | 17545 |
14 | Smodels-IE | 0 | 0 / 29 = 0% | 17545 |
15 | amsolver | 0 | 0 / 29 = 0% | 17545 |
16 | Enfragmo | 0 | 0 / 29 = 0% | 17545 |
2.1.5 Benchmark Reachability
There were 15 instances for this benchmark, 0 of which were unsatisfiable. 15 teams submitted a solution for this benchmark. The one team that did not participate was amsolver. 2 of the 15 instances were solved by all of the teams.
Result table for benchmark "Reachability" |
Place | Team | Score | Solved | Time (s) |
1 | DLV | 1 | 15 / 15 = 100% | 148 |
2 | bpsolver-CLP(FD) | 1 | 15 / 15 = 100% | 154 |
3 | Potassco | 1 | 15 / 15 = 100% | 156 |
4 | Smodels-IE | 1 | 15 / 15 = 100% | 202 |
5 | Cmodels | 1 | 15 / 15 = 100% | 204 |
6 | Sup | 1 | 15 / 15 = 100% | 204 |
7 | Claspfolio | 1 | 15 / 15 = 100% | 246 |
8 | LP2SAT+MINISAT | 1 | 15 / 15 = 100% | 413 |
9 | Enfragmo | 1 | 15 / 15 = 100% | 545 |
10 | LP2DIFF+BCLT | 1 | 15 / 15 = 100% | 856 |
11 | LP2DIFF+YICES | 0.93 | 14 / 15 = 93% | 1005 |
12 | IDP | 0.73 | 11 / 15 = 73% | 2970 |
13 | ASPeRiX | 0.46 | 7 / 15 = 46% | 5642 |
14 | sabe | 0.33 | 5 / 15 = 33% | 6990 |
15 | pbmodels | 0.13 | 2 / 15 = 13% | 7831 |
16 | amsolver | 0 | 0 / 15 = 0% | 9075 |
2.2 NP problems (not polynomially solvable)
2.2.1 Benchmark BlockedNQueens
There were 29 instances for this benchmark, 14 of which were unsatisfiable. 16 teams submitted a solution for this benchmark.
Result table for benchmark "BlockedNQueens" |
Place | Team | Score | Solved | Time (s) |
1 | amsolver | 1 | 29 / 29 = 100% | 83 |
2 | Potassco | 1 | 29 / 29 = 100% | 136 |
3 | Cmodels | 1 | 29 / 29 = 100% | 288 |
4 | LP2DIFF+YICES | 1 | 29 / 29 = 100% | 538 |
5 | bpsolver-CLP(FD) | 1 | 29 / 29 = 100% | 586 |
6 | LP2SAT+MINISAT | 1 | 29 / 29 = 100% | 674 |
7 | IDP | 1 | 29 / 29 = 100% | 757 |
8 | Sup | 1 | 29 / 29 = 100% | 914 |
9 | Smodels-IE | 1 | 29 / 29 = 100% | 1069 |
10 | Claspfolio | 1 | 29 / 29 = 100% | 1474 |
11 | pbmodels | 1 | 29 / 29 = 100% | 1850 |
12 | LP2DIFF+BCLT | 1 | 29 / 29 = 100% | 2064 |
13 | Enfragmo | 1 | 29 / 29 = 100% | 2758 |
14 | DLV | 0.96 | 28 / 29 = 96% | 10966 |
15 | sabe | 0.93 | 27 / 29 = 93% | 4260 |
16 | ASPeRiX | 0 | 0 / 29 = 0% | 17400 |
2.2.2 Benchmark Sokoban
There were 29 instances for this benchmark, 20 of which were unsatisfiable. 13 teams submitted a solution for this benchmark. The 3 teams that did not participate were: bpsolver-CLP(FD), Enfragmo, amsolver.
Result table for benchmark "Sokoban" |
Place | Team | Score | Solved | Time (s) |
1 | Potassco | 1 | 29 / 29 = 100% | 16 |
2 | IDP | 1 | 29 / 29 = 100% | 55 |
3 | LP2DIFF+YICES | 1 | 29 / 29 = 100% | 78 |
4 | Claspfolio | 1 | 29 / 29 = 100% | 84 |
5 | Cmodels | 1 | 29 / 29 = 100% | 100 |
6 | LP2DIFF+BCLT | 1 | 29 / 29 = 100% | 456 |
7 | LP2SAT+MINISAT | 1 | 29 / 29 = 100% | 508 |
8 | Sup | 1 | 29 / 29 = 100% | 1029 |
9 | DLV | 1 | 29 / 29 = 100% | 1761 |
10 | sabe | 0.89 | 26 / 29 = 89% | 6487 |
11 | Smodels-IE | 0.82 | 24 / 29 = 82% | 4759 |
12 | pbmodels | 0.13 | 4 / 29 = 13% | 15818 |
13 | ASPeRiX | 0 | 0 / 29 = 0% | 17400 |
14 | bpsolver-CLP(FD) | 0 | 0 / 29 = 0% | 17545 |
15 | amsolver | 0 | 0 / 29 = 0% | 17545 |
16 | Enfragmo | 0 | 0 / 29 = 0% | 17545 |
2.2.3 Benchmark 15Puzzle
There were 16 instances for this benchmark, 0 of which were unsatisfiable. 15 teams submitted a solution for this benchmark. The one team that did not participate was amsolver.
Result table for benchmark "15Puzzle" |
Place | Team | Score | Solved | Time (s) |
1 | Potassco | 1 | 16 / 16 = 100% | 238 |
2 | Claspfolio | 1 | 16 / 16 = 100% | 609 |
3 | Cmodels | 1 | 16 / 16 = 100% | 1076 |
4 | LP2DIFF+YICES | 1 | 16 / 16 = 100% | 1113 |
5 | LP2SAT+MINISAT | 1 | 16 / 16 = 100% | 1377 |
6 | LP2DIFF+BCLT | 1 | 16 / 16 = 100% | 1852 |
7 | IDP | 0.93 | 15 / 16 = 93% | 1594 |
8 | pbmodels | 0.93 | 15 / 16 = 93% | 1888 |
9 | sabe | 0.93 | 15 / 16 = 93% | 2627 |
10 | bpsolver-CLP(FD) | 0.87 | 14 / 16 = 87% | 3260 |
11 | Sup | 0.81 | 13 / 16 = 81% | 3863 |
12 | Enfragmo | 0.5 | 8 / 16 = 50% | 6584 |
13 | DLV | 0.31 | 5 / 16 = 31% | 7859 |
14 | Smodels-IE | 0 | 0 / 16 = 0% | 9600 |
15 | ASPeRiX | 0 | 0 / 16 = 0% | 9600 |
16 | amsolver | 0 | 0 / 16 = 0% | 9680 |
2.2.4 Benchmark HamiltonianPath
There were 29 instances for this benchmark, 0 of which were unsatisfiable. 15 teams submitted a solution for this benchmark. The one team that did not participate was ASPeRiX.
Result table for benchmark "HamiltonianPath" |
Place | Team | Score | Solved | Time (s) |
1 | Potassco | 1 | 29 / 29 = 100% | 3 |
2 | IDP | 1 | 29 / 29 = 100% | 4 |
3 | Cmodels | 1 | 29 / 29 = 100% | 7 |
4 | LP2DIFF+YICES | 1 | 29 / 29 = 100% | 9 |
5 | LP2DIFF+BCLT | 1 | 29 / 29 = 100% | 24 |
6 | Claspfolio | 1 | 29 / 29 = 100% | 39 |
7 | LP2SAT+MINISAT | 1 | 29 / 29 = 100% | 179 |
8 | Enfragmo | 1 | 29 / 29 = 100% | 377 |
9 | bpsolver-CLP(FD) | 1 | 29 / 29 = 100% | 1051 |
10 | Sup | 0.89 | 26 / 29 = 89% | 1815 |
11 | Smodels-IE | 0.86 | 25 / 29 = 86% | 2478 |
12 | DLV | 0.72 | 21 / 29 = 72% | 5988 |
13 | amsolver | 0.44 | 13 / 29 = 44% | 11176 |
14 | pbmodels | 0 | 0 / 29 = 0% | 17400 |
15 | sabe | 0 | 0 / 29 = 0% | 17400 |
16 | ASPeRiX | 0 | 0 / 29 = 0% | 17545 |
2.2.5 Benchmark SchurNumbers
There were 29 instances for this benchmark, 16 of which were unsatisfiable. 16 teams submitted a solution for this benchmark.
Result table for benchmark "Schurnumbers" |
Place | Team | Score | Solved | Time (s) |
1 | Potassco | 1 | 29 / 29 = 100% | 644 |
2 | LP2DIFF+YICES | 1 | 29 / 29 = 100% | 1691 |
3 | Claspfolio | 0.96 | 28 / 29 = 96% | 1130 |
4 | amsolver | 0.96 | 28 / 29 = 96% | 1594 |
5 | Cmodels | 0.96 | 28 / 29 = 96% | 2032 |
6 | Enfragmo | 0.96 | 28 / 29 = 96% | 2426 |
7 | IDP | 0.93 | 27 / 29 = 93% | 2038 |
8 | LP2SAT+MINISAT | 0.93 | 27 / 29 = 93% | 3354 |
9 | pbmodels | 0.93 | 27 / 29 = 93% | 4229 |
10 | LP2DIFF+BCLT | 0.89 | 26 / 29 = 89% | 2287 |
11 | bpsolver-CLP(FD) | 0.89 | 26 / 29 = 89% | 3916 |
12 | Sup | 0.82 | 24 / 29 = 82% | 4685 |
13 | sabe | 0.62 | 18 / 29 = 62% | 8202 |
14 | DLV | 0.62 | 18 / 29 = 62% | 8409 |
15 | Smodels-IE | 0.2 | 6 / 29 = 20% | 14543 |
16 | ASPeRiX | 0 | 0 / 29 = 0% | 17400 |
2.2.6 Benchmark TravellingSalesPerson
There were 29 instances for this benchmark, 0 of which were unsatisfiable. 15 teams submitted a solution for this benchmark. The one team that did not participate was ASPeRiX.
Result table for benchmark "TravellingSalesPerson" |
Place | Team | Score | Solved | Time (s) |
1 | Potassco | 1 | 29 / 29 = 100% | 2 |
2 | IDP | 1 | 29 / 29 = 100% | 4 |
3 | Claspfolio | 1 | 29 / 29 = 100% | 44 |
4 | Smodels-IE | 1 | 29 / 29 = 100% | 75 |
5 | DLV | 1 | 29 / 29 = 100% | 520 |
6 | LP2DIFF+BCLT | 1 | 29 / 29 = 100% | 917 |
7 | LP2SAT+MINISAT | 0.86 | 25 / 29 = 86% | 7033 |
8 | sabe | 0.75 | 22 / 29 = 75% | 9550 |
9 | Cmodels | 0.75 | 22 / 29 = 75% | 12306 |
10 | bpsolver-CLP(FD) | 0.37 | 11 / 29 = 37% | 10895 |
11 | Sup | 0.27 | 8 / 29 = 27% | 16356 |
12 | pbmodels | 0.03 | 1 / 29 = 3% | 17088 |
13 | LP2DIFF+YICES | 0 | 0 / 29 = 0% | 17400 |
14 | amsolver | 0 | 0 / 29 = 0% | 17400 |
15 | Enfragmo | 0 | 0 / 29 = 0% | 17400 |
16 | ASPeRiX | 0 | 0 / 29 = 0% | 17545 |
2.2.7 Benchmark WeightBoundedDominatingSet
There were 29 instances for this benchmark, 0 of which were unsatisfiable. 14 teams submitted a solution for this benchmark. The 2 teams that did not participate were: ASPeRiX, amsolver.
Result table for benchmark "WeightBoundedDominatingSet" |
Place | Team | Score | Solved | Time (s) |
1 | Potassco | 1 | 29 / 29 = 100% | 166 |
2 | LP2DIFF+YICES | 0.89 | 26 / 29 = 89% | 3606 |
3 | LP2SAT+MINISAT | 0.89 | 26 / 29 = 89% | 3910 |
4 | Sup | 0.86 | 25 / 29 = 86% | 3907 |
5 | IDP | 0.72 | 21 / 29 = 72% | 5912 |
6 | Cmodels | 0.68 | 20 / 29 = 68% | 7395 |
7 | pbmodels | 0.51 | 15 / 29 = 51% | 9278 |
8 | Enfragmo | 0.48 | 14 / 29 = 48% | 11160 |
9 | Claspfolio | 0.27 | 8 / 29 = 27% | 13626 |
10 | LP2DIFF+BCLT | 0.27 | 8 / 29 = 27% | 14481 |
11 | bpsolver-CLP(FD) | 0.24 | 7 / 29 = 24% | 13536 |
12 | DLV | 0.2 | 6 / 29 = 20% | 14972 |
13 | Smodels-IE | 0.1 | 3 / 29 = 10% | 15760 |
14 | sabe | 0 | 0 / 29 = 0% | 17400 |
15 | ASPeRiX | 0 | 0 / 29 = 0% | 17545 |
16 | amsolver | 0 | 0 / 29 = 0% | 17545 |
2.2.8 Benchmark Labyrinth
There were 29 instances for this benchmark, 0 of which were unsatisfiable. 10 teams submitted a solution for this benchmark. The 6 teams that did not participate were: ASPeRiX, bpsolver-CLP(FD), Enfragmo, pbmodels, sabe, amsolver.
After the competition, an error was discovered in the Asparagus encoding used
by all teams except DLV and IDP.
Result table for benchmark "Labyrinth" |
Place | Team | Score | Solved | Time (s) |
1 | Potassco | 0.96 | 28 / 29 = 96% | 1758 |
2 | Claspfolio | 0.89 | 26 / 29 = 89% | 4554 |
3 | LP2SAT+MINISAT | 0.89 | 26 / 29 = 89% | 6184 |
4 | DLV | 0.86 | 25 / 29 = 86% | 2440 |
5 | Cmodels | 0.86 | 25 / 29 = 86% | 5288 |
6 | Sup | 0.68 | 20 / 29 = 68% | 7940 |
7 | Smodels-IE | 0.24 | 7 / 29 = 24% | 16804 |
8 | LP2DIFF+YICES | 0.17 | 5 / 29 = 17% | 15664 |
9 | IDP | 0.1 | 3 / 29 = 10% | 16332 |
10 | LP2DIFF+BCLT | 0 | 0 / 29 = 0% | 17400 |
11 | pbmodels | 0 | 0 / 29 = 0% | 17545 |
12 | bpsolver-CLP(FD) | 0 | 0 / 29 = 0% | 17545 |
13 | sabe | 0 | 0 / 29 = 0% | 17545 |
14 | ASPeRiX | 0 | 0 / 29 = 0% | 17545 |
15 | amsolver | 0 | 0 / 29 = 0% | 17545 |
16 | Enfragmo | 0 | 0 / 29 = 0% | 17545 |
2.2.9 Benchmark GeneralizedSlitherlink
There were 29 instances for this benchmark, 0 of which were unsatisfiable. 14 teams submitted a solution for this benchmark. The 2 teams that did not participate were: ASPeRiX, amsolver. 1 of the 29 instances was solved by all of the teams.
Result table for benchmark "GeneralizedSlitherLink" |
Place | Team | Score | Solved | Time (s) |
1 | Smodels-IE | 1 | 29 / 29 = 100% | 8 |
2 | Potassco | 1 | 29 / 29 = 100% | 9 |
3 | Sup | 1 | 29 / 29 = 100% | 15 |
4 | IDP | 1 | 29 / 29 = 100% | 17 |
5 | DLV | 1 | 29 / 29 = 100% | 26 |
6 | LP2DIFF+YICES | 1 | 29 / 29 = 100% | 30 |
7 | Claspfolio | 1 | 29 / 29 = 100% | 48 |
8 | Cmodels | 1 | 29 / 29 = 100% | 66 |
9 | LP2DIFF+BCLT | 1 | 29 / 29 = 100% | 123 |
10 | LP2SAT+MINISAT | 1 | 29 / 29 = 100% | 171 |
11 | Enfragmo | 1 | 29 / 29 = 100% | 813 |
12 | bpsolver-CLP(FD) | 0.68 | 20 / 29 = 68% | 5841 |
13 | pbmodels | 0.03 | 1 / 29 = 3% | 16829 |
14 | sabe | 0.03 | 1 / 29 = 3% | 16844 |
15 | ASPeRiX | 0 | 0 / 29 = 0% | 17545 |
16 | amsolver | 0 | 0 / 29 = 0% | 17545 |
2.2.10 Benchmark HierarchicalClustering
There were 12 instances for this benchmark, 4 of which were unsatisfiable. 15 teams submitted a solution for this benchmark. The one team that did not participate was ASPeRiX. 1 of the 12 instances was solved by all of the teams.
Result table for benchmark "HierarchicalClustering" |
Place | Team | Score | Solved | Time (s) |
1 | Potassco | 1 | 12 / 12 = 100% | 2 |
2 | IDP | 1 | 12 / 12 = 100% | 4 |
3 | Sup | 1 | 12 / 12 = 100% | 5 |
4 | DLV | 1 | 12 / 12 = 100% | 9 |
5 | Cmodels | 1 | 12 / 12 = 100% | 11 |
6 | Claspfolio | 1 | 12 / 12 = 100% | 17 |
7 | LP2SAT+MINISAT | 1 | 12 / 12 = 100% | 17 |
8 | LP2DIFF+YICES | 1 | 12 / 12 = 100% | 27 |
9 | Enfragmo | 1 | 12 / 12 = 100% | 60 |
10 | LP2DIFF+BCLT | 1 | 12 / 12 = 100% | 256 |
11 | Smodels-IE | 0.83 | 10 / 12 = 83% | 1744 |
12 | bpsolver-CLP(FD) | 0.25 | 3 / 12 = 25% | 5565 |
13 | pbmodels | 0.08 | 1 / 12 = 8% | 6817 |
14 | sabe | 0.08 | 1 / 12 = 8% | 6831 |
15 | amsolver | 0.08 | 1 / 12 = 8% | 7136 |
16 | ASPeRiX | 0 | 0 / 12 = 0% | 7260 |
2.2.11 Benchmark ConnectedDominatingSet
There were 21 instances for this benchmark, 11 of which were unsatisfiable. 15 teams submitted a solution for this benchmark. The one team that did not participate was ASPeRiX.
Result table for benchmark "ConnectedDominatingSet" |
Place | Team | Score | Solved | Time (s) |
1 | Potassco | 1 | 21 / 21 = 100% | 39 |
2 | LP2DIFF+YICES | 1 | 21 / 21 = 100% | 128 |
3 | LP2DIFF+BCLT | 1 | 21 / 21 = 100% | 169 |
4 | LP2SAT+MINISAT | 1 | 21 / 21 = 100% | 316 |
5 | Sup | 1 | 21 / 21 = 100% | 465 |
6 | Cmodels | 1 | 21 / 21 = 100% | 535 |
7 | Claspfolio | 0.95 | 20 / 21 = 95% | 864 |
8 | Enfragmo | 0.95 | 20 / 21 = 95% | 3260 |
9 | IDP | 0.9 | 19 / 21 = 90% | 1589 |
10 | pbmodels | 0.9 | 19 / 21 = 90% | 2551 |
11 | Smodels-IE | 0.71 | 15 / 21 = 71% | 3943 |
12 | bpsolver-CLP(FD) | 0.61 | 13 / 21 = 61% | 5562 |
13 | DLV | 0.61 | 13 / 21 = 61% | 6242 |
14 | amsolver | 0.14 | 3 / 21 = 14% | 11067 |
15 | sabe | 0 | 0 / 21 = 0% | 12600 |
16 | ASPeRiX | 0 | 0 / 21 = 0% | 12705 |
2.2.12 Benchmark GraphColouring
There were 29 instances for this benchmark, 19 of which were unsatisfiable. 15 teams submitted a solution for this benchmark. The one team that did not participate was ASPeRiX. 6 of the 29 instances were solved by all of the teams. One instance was solved by no team.
Result table for benchmark "GraphColouring" |
Place | Team | Score | Solved | Time (s) |
1 | Enfragmo | 1 | 28 / 29 = 96% | 3305 |
2 | Potassco | 0.6 | 17 / 29 = 58% | 9561 |
3 | LP2DIFF+YICES | 0.42 | 12 / 29 = 41% | 11541 |
4 | bpsolver-CLP(FD) | 0.39 | 11 / 29 = 37% | 12264 |
5 | IDP | 0.35 | 10 / 29 = 34% | 11923 |
6 | Cmodels | 0.35 | 10 / 29 = 34% | 12287 |
7 | LP2SAT+MINISAT | 0.32 | 9 / 29 = 31% | 12806 |
8 | amsolver | 0.32 | 9 / 29 = 31% | 12884 |
9 | Claspfolio | 0.32 | 9 / 29 = 31% | 13016 |
10 | Sup | 0.28 | 8 / 29 = 27% | 12755 |
11 | Smodels-IE | 0.28 | 8 / 29 = 27% | 12794 |
12 | sabe | 0.28 | 8 / 29 = 27% | 13029 |
13 | DLV | 0.28 | 8 / 29 = 27% | 13366 |
14 | pbmodels | 0.25 | 7 / 29 = 24% | 13807 |
15 | LP2DIFF+BCLT | 0.21 | 6 / 29 = 20% | 14195 |
16 | ASPeRiX | 0 | 0 / 29 = 0% | 17545 |
2.2.13 Benchmark Hanoi
There were 15 instances for this benchmark, 0 of which were unsatisfiable. 15 teams submitted a solution for this benchmark. The one team that did not participate was amsolver. 1 of the 15 instances was solved by all of the teams.
Result table for benchmark "Hanoi" |
Place | Team | Score | Solved | Time (s) |
1 | Potassco | 1 | 15 / 15 = 100% | 39 |
2 | Cmodels | 1 | 15 / 15 = 100% | 59 |
3 | Sup | 1 | 15 / 15 = 100% | 149 |
4 | Claspfolio | 1 | 15 / 15 = 100% | 182 |
5 | LP2SAT+MINISAT | 1 | 15 / 15 = 100% | 184 |
6 | IDP | 1 | 15 / 15 = 100% | 195 |
7 | DLV | 1 | 15 / 15 = 100% | 229 |
8 | ASPeRiX | 1 | 15 / 15 = 100% | 760 |
9 | LP2DIFF+BCLT | 1 | 15 / 15 = 100% | 1073 |
10 | Enfragmo | 0.93 | 14 / 15 = 93% | 1442 |
11 | LP2DIFF+YICES | 0.73 | 11 / 15 = 73% | 2534 |
12 | pbmodels | 0.2 | 3 / 15 = 20% | 7770 |
13 | sabe | 0.13 | 2 / 15 = 13% | 7828 |
14 | Smodels-IE | 0.13 | 2 / 15 = 13% | 7869 |
15 | bpsolver-CLP(FD) | 0.06 | 1 / 15 = 6% | 8430 |
16 | amsolver | 0 | 0 / 15 = 0% | 9075 |
2.2.14 Benchmark FastFood
There were 29 instances for this benchmark, 19 of which were unsatisfiable. 14 teams submitted a solution for this benchmark. The 2 teams that did not participate were: ASPeRiX, amsolver.
Result table for benchmark "Fastfood" |
Place | Team | Score | Solved | Time (s) |
1 | Potassco | 1 | 29 / 29 = 100% | 33 |
2 | bpsolver-CLP(FD) | 1 | 29 / 29 = 100% | 35 |
3 | Claspfolio | 1 | 29 / 29 = 100% | 95 |
4 | LP2DIFF+YICES | 1 | 29 / 29 = 100% | 247 |
5 | DLV | 1 | 29 / 29 = 100% | 657 |
6 | Smodels-IE | 1 | 29 / 29 = 100% | 941 |
7 | Enfragmo | 1 | 29 / 29 = 100% | 1643 |
8 | IDP | 0.96 | 28 / 29 = 96% | 4276 |
9 | LP2SAT+MINISAT | 0.93 | 27 / 29 = 93% | 3739 |
10 | LP2DIFF+BCLT | 0.89 | 26 / 29 = 89% | 6637 |
11 | Sup | 0.55 | 16 / 29 = 55% | 8445 |
12 | Cmodels | 0.55 | 16 / 29 = 55% | 8489 |
13 | pbmodels | 0 | 0 / 29 = 0% | 17400 |
14 | sabe | 0 | 0 / 29 = 0% | 17400 |
15 | ASPeRiX | 0 | 0 / 29 = 0% | 17545 |
16 | amsolver | 0 | 0 / 29 = 0% | 17545 |
2.2.15 Benchmark WireRouting
There were 23 instances for this benchmark, 11 of which were unsatisfiable. 13 teams submitted a solution for this benchmark. The 3 teams that did not participate were: ASPeRiX, Enfragmo, amsolver.
Result table for benchmark "WireRouting" |
Place | Team | Score | Solved | Time (s) |
1 | Claspfolio | 1 | 23 / 23 = 100% | 378 |
2 | Potassco | 0.95 | 22 / 23 = 95% | 1038 |
3 | IDP | 0.86 | 20 / 23 = 86% | 2958 |
4 | sabe | 0.78 | 18 / 23 = 78% | 8507 |
5 | Cmodels | 0.73 | 17 / 23 = 73% | 4891 |
6 | pbmodels | 0.73 | 17 / 23 = 73% | 8278 |
7 | Sup | 0.69 | 16 / 23 = 69% | 4929 |
8 | LP2DIFF+YICES | 0.43 | 10 / 23 = 43% | 9491 |
9 | LP2DIFF+BCLT | 0.39 | 9 / 23 = 39% | 9745 |
10 | LP2SAT+MINISAT | 0.3 | 7 / 23 = 30% | 10326 |
11 | DLV | 0.26 | 6 / 23 = 26% | 10639 |
12 | Smodels-IE | 0.21 | 5 / 23 = 21% | 11326 |
13 | bpsolver-CLP(FD) | 0 | 0 / 23 = 0% | 13800 |
14 | ASPeRiX | 0 | 0 / 23 = 0% | 13915 |
15 | amsolver | 0 | 0 / 23 = 0% | 13915 |
16 | Enfragmo | 0 | 0 / 23 = 0% | 13915 |
2.2.16 Benchmark Sudoku
There were 10 instances for this benchmark, 0 of which were unsatisfiable. 16 teams submitted a solution for this benchmark. Of the participating teams, one (amsolver) was disqualified.
Result table for benchmark "Sudoku" |
Place | Team | Score | Solved | Time (s) |
1 | IDP | 1 | 10 / 10 = 100% | 2 |
2 | Potassco | 1 | 10 / 10 = 100% | 3 |
3 | Smodels-IE | 1 | 10 / 10 = 100% | 3 |
4 | LP2SAT+MINISAT | 1 | 10 / 10 = 100% | 8 |
5 | Sup | 1 | 10 / 10 = 100% | 11 |
6 | LP2DIFF+YICES | 1 | 10 / 10 = 100% | 13 |
7 | Enfragmo | 1 | 10 / 10 = 100% | 15 |
8 | Claspfolio | 1 | 10 / 10 = 100% | 17 |
9 | Cmodels | 1 | 10 / 10 = 100% | 31 |
10 | bpsolver-CLP(FD) | 1 | 10 / 10 = 100% | 56 |
11 | DLV | 1 | 10 / 10 = 100% | 75 |
12 | LP2DIFF+BCLT | 1 | 10 / 10 = 100% | 148 |
13 | pbmodels | 1 | 10 / 10 = 100% | 1301 |
14 | sabe | 1 | 10 / 10 = 100% | 1354 |
15 | ASPeRiX | 0.2 | 2 / 10 = 20% | 4804 |
16 | amsolver | 0 | 0 / 10 = 0% | 6050 |
2.2.17 Benchmark DisjunctiveScheduling
There were 10 instances for this benchmark, 0 of which were unsatisfiable. 13 teams submitted a solution for this benchmark. The 3 teams that did not participate were: pbmodels, sabe, amsolver.
Result table for benchmark "DisjunctiveScheduling" |
Place | Team | Score | Solved | Time (s) |
1 | bpsolver-CLP(FD) | 1 | 10 / 10 = 100% | 2 |
2 | Potassco | 1 | 10 / 10 = 100% | 121 |
3 | ASPeRiX | 1 | 10 / 10 = 100% | 356 |
4 | Enfragmo | 1 | 10 / 10 = 100% | 474 |
5 | IDP | 1 | 10 / 10 = 100% | 1056 |
6 | Sup | 0.5 | 5 / 10 = 50% | 3042 |
7 | LP2SAT+MINISAT | 0.5 | 5 / 10 = 50% | 3066 |
8 | Claspfolio | 0.5 | 5 / 10 = 50% | 3109 |
9 | Smodels-IE | 0.5 | 5 / 10 = 50% | 3532 |
10 | DLV | 0.5 | 5 / 10 = 50% | 3622 |
11 | Cmodels | 0.4 | 4 / 10 = 40% | 4481 |
12 | LP2DIFF+YICES | 0 | 0 / 10 = 0% | 6000 |
13 | LP2DIFF+BCLT | 0 | 0 / 10 = 0% | 6000 |
14 | pbmodels | 0 | 0 / 10 = 0% | 6050 |
15 | sabe | 0 | 0 / 10 = 0% | 6050 |
16 | amsolver | 0 | 0 / 10 = 0% | 6050 |
2.2.18 Benchmark KnightTour
There were 10 instances for this benchmark, 0 of which were unsatisfiable. 14 teams submitted a solution for this benchmark. The 2 teams that did not participate were: ASPeRiX, amsolver.
Result table for benchmark "KnightTour" |
Place | Team | Score | Solved | Time (s) |
1 | bpsolver-CLP(FD) | 1 | 10 / 10 = 100% | 7 |
2 | Potassco | 1 | 10 / 10 = 100% | 18 |
3 | Claspfolio | 1 | 10 / 10 = 100% | 43 |
4 | Smodels-IE | 0.7 | 7 / 10 = 70% | 1838 |
5 | Cmodels | 0.6 | 6 / 10 = 60% | 2405 |
6 | DLV | 0.6 | 6 / 10 = 60% | 2488 |
7 | IDP | 0.5 | 5 / 10 = 50% | 3009 |
8 | Sup | 0.5 | 5 / 10 = 50% | 3253 |
9 | Enfragmo | 0.3 | 3 / 10 = 30% | 4828 |
10 | LP2DIFF+BCLT | 0.2 | 2 / 10 = 20% | 4938 |
11 | LP2SAT+MINISAT | 0.2 | 2 / 10 = 20% | 4965 |
12 | LP2DIFF+YICES | 0.1 | 1 / 10 = 10% | 5400 |
13 | pbmodels | 0 | 0 / 10 = 0% | 6000 |
14 | sabe | 0 | 0 / 10 = 0% | 6000 |
15 | ASPeRiX | 0 | 0 / 10 = 0% | 6050 |
16 | amsolver | 0 | 0 / 10 = 0% | 6050 |
2.2.19 Benchmark ChannelRouting
There were 11 instances for this benchmark, 2 of which were unsatisfiable. 14 teams submitted a solution for this benchmark. The 2 teams that did not participate were: Enfragmo, amsolver. 5 of the 11 instances were solved by all of the teams. 3 of the instances were solved by no team.
Result table for benchmark "ChannelRouting" |
Place | Team | Score | Solved | Time (s) |
1 | bpsolver-CLP(FD) | 1 | 8 / 11 = 72% | 1801 |
2 | IDP | 1 | 8 / 11 = 72% | 1802 |
3 | Potassco | 1 | 8 / 11 = 72% | 1802 |
4 | Sup | 1 | 8 / 11 = 72% | 1804 |
5 | Cmodels | 1 | 8 / 11 = 72% | 1807 |
6 | Smodels-IE | 1 | 8 / 11 = 72% | 1807 |
7 | LP2SAT+MINISAT | 1 | 8 / 11 = 72% | 1809 |
8 | Claspfolio | 1 | 8 / 11 = 72% | 1814 |
9 | LP2DIFF+YICES | 1 | 8 / 11 = 72% | 1817 |
10 | pbmodels | 1 | 8 / 11 = 72% | 1995 |
11 | sabe | 1 | 8 / 11 = 72% | 2015 |
12 | LP2DIFF+BCLT | 1 | 8 / 11 = 72% | 2613 |
13 | DLV | 0.87 | 7 / 11 = 63% | 2416 |
14 | ASPeRiX | 0.62 | 5 / 11 = 45% | 3606 |
15 | amsolver | 0 | 0 / 11 = 0% | 6655 |
16 | Enfragmo | 0 | 0 / 11 = 0% | 6655 |
2.2.20 Benchmark EdgeMatching
There were 29 instances for this benchmark, 0 of which were unsatisfiable. 14 teams submitted a solution for this benchmark. The 2 teams that did not participate were: ASPeRiX, amsolver.
After the competition, an error was discovered in the Asparagus and
DLV encoding. The Asparagus encoding was used by all teams except
bpsolver-CLP(FD), IDP and Enfragmo.
Result table for benchmark "Edgematching" |
Place | Team | Score | Solved | Time (s) |
1 | Potassco | 1 | 29 / 29 = 100% | 28 |
2 | Sup | 1 | 29 / 29 = 100% | 50 |
3 | Claspfolio | 1 | 29 / 29 = 100% | 83 |
4 | LP2SAT+MINISAT | 1 | 29 / 29 = 100% | 100 |
5 | Smodels-IE | 1 | 29 / 29 = 100% | 996 |
6 | pbmodels | 1 | 29 / 29 = 100% | 1754 |
7 | Cmodels | 1 | 29 / 29 = 100% | 1793 |
8 | sabe | 1 | 29 / 29 = 100% | 2221 |
9 | DLV | 1 | 29 / 29 = 100% | 2546 |
10 | LP2DIFF+BCLT | 0.68 | 20 / 29 = 68% | 11920 |
11 | LP2DIFF+YICES | 0.48 | 14 / 29 = 48% | 9067 |
12 | bpsolver-CLP(FD) | 0.13 | 4 / 29 = 13% | 15119 |
13 | IDP | 0.13 | 4 / 29 = 13% | 15291 |
14 | Enfragmo | 0 | 0 / 29 = 0% | 17400 |
15 | ASPeRiX | 0 | 0 / 29 = 0% | 17545 |
16 | amsolver | 0 | 0 / 29 = 0% | 17545 |
2.2.21 Benchmark GraphPartitioning
There were 13 instances for this benchmark, 7 of which were unsatisfiable. 14 teams submitted a solution for this benchmark. The 2 teams that did not participate were: ASPeRiX, amsolver. Of the participating teams, one (sabe) was disqualified. 1 of the 13 instances was solved by all of the teams.
Result table for benchmark "GraphPartitioning" |
Place | Team | Score | Solved | Time (s) |
1 | Potassco | 1 | 13 / 13 = 100% | 78 |
2 | IDP | 1 | 13 / 13 = 100% | 107 |
3 | Cmodels | 1 | 13 / 13 = 100% | 386 |
4 | Claspfolio | 0.92 | 12 / 13 = 92% | 937 |
5 | Sup | 0.92 | 12 / 13 = 92% | 964 |
6 | DLV | 0.84 | 11 / 13 = 84% | 1457 |
7 | Enfragmo | 0.84 | 11 / 13 = 84% | 2822 |
8 | Smodels-IE | 0.76 | 10 / 13 = 76% | 2457 |
9 | LP2DIFF+YICES | 0.69 | 9 / 13 = 69% | 2852 |
10 | LP2DIFF+BCLT | 0.69 | 9 / 13 = 69% | 3543 |
11 | pbmodels | 0.61 | 8 / 13 = 61% | 3821 |
12 | LP2SAT+MINISAT | 0.38 | 5 / 13 = 38% | 4919 |
13 | bpsolver-CLP(FD) | 0.3 | 4 / 13 = 30% | 5912 |
14 | sabe | 0 | 0 / 13 = 0% | 7865 |
15 | ASPeRiX | 0 | 0 / 13 = 0% | 7865 |
16 | amsolver | 0 | 0 / 13 = 0% | 7865 |
2.2.22 Benchmark MazeGeneration
There were 29 instances for this benchmark, 19 of which were unsatisfiable. 13 teams submitted a solution for this benchmark. The 3 teams that did not participate were: ASPeRiX, Enfragmo, amsolver. Of the participating teams, one (pbmodels) was disqualified.
Result table for benchmark "MazeGeneration" |
Place | Team | Score | Solved | Time (s) |
1 | DLV | 1 | 29 / 29 = 100% | 13 |
2 | Potassco | 0.93 | 27 / 29 = 93% | 1623 |
3 | LP2DIFF+BCLT | 0.93 | 27 / 29 = 93% | 1946 |
4 | LP2DIFF+YICES | 0.86 | 25 / 29 = 86% | 2559 |
5 | Cmodels | 0.86 | 25 / 29 = 86% | 2646 |
6 | Sup | 0.86 | 25 / 29 = 86% | 2815 |
7 | IDP | 0.86 | 25 / 29 = 86% | 2972 |
8 | Claspfolio | 0.86 | 25 / 29 = 86% | 3377 |
9 | LP2SAT+MINISAT | 0.82 | 24 / 29 = 82% | 5350 |
10 | Smodels-IE | 0.51 | 15 / 29 = 51% | 8552 |
11 | bpsolver-CLP(FD) | 0.48 | 14 / 29 = 48% | 9007 |
12 | sabe | 0 | 0 / 29 = 0% | 17400 |
13 | pbmodels | 0 | 0 / 29 = 0% | 17545 |
14 | ASPeRiX | 0 | 0 / 29 = 0% | 17545 |
15 | amsolver | 0 | 0 / 29 = 0% | 17545 |
16 | Enfragmo | 0 | 0 / 29 = 0% | 17545 |
2.2.23 Benchmark Solitaire
There were 27 instances for this benchmark, 0 of which were unsatisfiable. 14 teams submitted a solution for this benchmark. The 2 teams that did not participate were: ASPeRiX, amsolver. 4 of the 27 instances were solved by all of the teams. 4 of the instances were solved by no team.
Result table for benchmark "Solitaire" |
Place | Team | Score | Solved | Time (s) |
1 | DLV | 0.91 | 21 / 27 = 77% | 3785 |
2 | bpsolver-CLP(FD) | 0.91 | 21 / 27 = 77% | 3815 |
3 | Potassco | 0.91 | 21 / 27 = 77% | 3884 |
4 | Cmodels | 0.91 | 21 / 27 = 77% | 3893 |
5 | Claspfolio | 0.91 | 21 / 27 = 77% | 3964 |
6 | Sup | 0.86 | 20 / 27 = 74% | 4529 |
7 | pbmodels | 0.86 | 20 / 27 = 74% | 4542 |
8 | LP2SAT+MINISAT | 0.86 | 20 / 27 = 74% | 4877 |
9 | LP2DIFF+YICES | 0.82 | 19 / 27 = 70% | 5173 |
10 | IDP | 0.82 | 19 / 27 = 70% | 5520 |
11 | sabe | 0.78 | 18 / 27 = 66% | 5828 |
12 | LP2DIFF+BCLT | 0.78 | 18 / 27 = 66% | 5917 |
13 | Enfragmo | 0.73 | 17 / 27 = 62% | 6318 |
14 | Smodels-IE | 0.17 | 4 / 27 = 14% | 14391 |
15 | ASPeRiX | 0 | 0 / 27 = 0% | 16335 |
16 | amsolver | 0 | 0 / 27 = 0% | 16335 |
2.3 NP^NP problems
2.3.1 Benchmark StrategicCompanies
There were 17 instances for this benchmark, 1 of which were unsatisfiable. 5 teams submitted a solution for this benchmark. The 11 teams that did not participate were: Claspfolio, Smodels-IE, ASPeRiX, Sup, Enfragmo, LP2DIFF+BCLT, LP2SAT+MINISAT, LP2DIFF+YICES, pbmodels, sabe, amsolver.
Result table for benchmark "StrategicCompanies" |
Place | Team | Score | Solved | Time (s) |
1 | DLV | 1 | 17 / 17 = 100% | 3091 |
2 | Potassco | 0.29 | 5 / 17 = 29% | 7618 |
3 | Cmodels | 0.23 | 4 / 17 = 23% | 8365 |
4 | bpsolver-CLP(FD) | 0.11 | 2 / 17 = 11% | 9000 |
5 | IDP | 0 | 0 / 17 = 0% | 10200 |
6 | Sup | 0 | 0 / 17 = 0% | 10285 |
7 | pbmodels | 0 | 0 / 17 = 0% | 10285 |
8 | LP2SAT+MINISAT | 0 | 0 / 17 = 0% | 10285 |
9 | Claspfolio | 0 | 0 / 17 = 0% | 10285 |
10 | LP2DIFF+YICES | 0 | 0 / 17 = 0% | 10285 |
11 | LP2DIFF+BCLT | 0 | 0 / 17 = 0% | 10285 |
12 | sabe | 0 | 0 / 17 = 0% | 10285 |
13 | Smodels-IE | 0 | 0 / 17 = 0% | 10285 |
14 | ASPeRiX | 0 | 0 / 17 = 0% | 10285 |
15 | amsolver | 0 | 0 / 17 = 0% | 10285 |
16 | Enfragmo | 0 | 0 / 17 = 0% | 10285 |
2.4 Optimization problems
2.4.1 Benchmark 15PuzzleOpt
There were 16 instances for this benchmark, 0 of which were unsatisfiable. 7 teams submitted a solution for this benchmark. The 9 teams that did not participate were: ASPeRiX, Cmodels, Sup, Enfragmo, LP2DIFF+BCLT, LP2SAT+MINISAT, LP2DIFF+YICES, pbmodels, amsolver.
Number of times OPTIMUM FOUND was produced |
Team | Number |
Potassco | 12 |
IDP | 14 |
Claspfolio | 9 |
Result table for benchmark "15PuzzleOpt" |
Place | Team | Score | Time (s) |
1 | Potassco | 1441.66 | 4504 |
2 | IDP | 1425 | 3311 |
3 | Claspfolio | 1291.66 | 6288 |
4 | bpsolver-CLP(FD) | 900 | 8427 |
5 | sabe | 600 | 3575 |
6 | DLV | 191.66 | 9600 |
7 | Smodels-IE | 0 | 9600 |
8 | Sup | 0 | 9680 |
9 | pbmodels | 0 | 9680 |
10 | LP2SAT+MINISAT | 0 | 9680 |
11 | LP2DIFF+YICES | 0 | 9680 |
12 | LP2DIFF+BCLT | 0 | 9680 |
13 | ASPeRiX | 0 | 9680 |
14 | amsolver | 0 | 9680 |
15 | Cmodels | 0 | 9680 |
16 | Enfragmo | 0 | 9680 |
2.4.2 Benchmark CompanyControlsOptimize
There were 15 instances for this benchmark, 4 of which were unsatisfiable. 12 teams submitted a solution for this benchmark. The 4 teams that did not participate were: ASPeRiX, pbmodels, sabe, amsolver.
Number of times OPTIMUM FOUND was produced |
Team | Number |
Potassco | 15 |
DLV | 15 |
Smodels-IE | 15 |
Claspfolio | 15 |
Result table for benchmark "CompanyControlsOptimize" |
Place | Team | Score | Time (s) |
1 | Potassco | 1500 | 112 |
2 | Claspfolio | 1500 | 195 |
3 | DLV | 1500 | 1719 |
4 | Smodels-IE | 1500 | 3301 |
5 | IDP | 0 | 9000 |
6 | bpsolver-CLP(FD) | 0 | 9000 |
7 | Sup | 0 | 9075 |
8 | pbmodels | 0 | 9075 |
9 | LP2SAT+MINISAT | 0 | 9075 |
10 | LP2DIFF+YICES | 0 | 9075 |
11 | LP2DIFF+BCLT | 0 | 9075 |
12 | sabe | 0 | 9075 |
13 | ASPeRiX | 0 | 9075 |
14 | amsolver | 0 | 9075 |
15 | Cmodels | 0 | 9075 |
16 | Enfragmo | 0 | 9075 |
2.4.3 Benchmark FastfoodOptimize
There were 29 instances for this benchmark, 0 of which were unsatisfiable. 6 teams submitted a solution for this benchmark. The 10 teams that did not participate were: ASPeRiX, Cmodels, Sup, Enfragmo, LP2DIFF+BCLT, LP2SAT+MINISAT, LP2DIFF+YICES, pbmodels, sabe, amsolver. 29 of the 29 instances were solved by all of the teams.
Number of times OPTIMUM FOUND was produced |
Team | Number |
Potassco | 21 |
DLV | 24 |
Smodels-IE | 17 |
IDP | 3 |
bpsolver-CLP(FD) | 19 |
Claspfolio | 29 |
Result table for benchmark "FastfoodOptimize" |
Place | Team | Score | Time (s) |
1 | Claspfolio | 2900 | 1466 |
2 | Potassco | 2875 | 1560 |
3 | DLV | 2769.65 | 7333 |
4 | Smodels-IE | 2403.02 | 9338 |
5 | bpsolver-CLP(FD) | 2361.03 | 10035 |
6 | IDP | 1332.38 | 16238 |
7 | Sup | 0 | 17545 |
8 | pbmodels | 0 | 17545 |
9 | LP2SAT+MINISAT | 0 | 17545 |
10 | LP2DIFF+YICES | 0 | 17545 |
11 | LP2DIFF+BCLT | 0 | 17545 |
12 | sabe | 0 | 17545 |
13 | ASPeRiX | 0 | 17545 |
14 | amsolver | 0 | 17545 |
15 | Cmodels | 0 | 17545 |
16 | Enfragmo | 0 | 17545 |
2.4.4 Benchmark GolombRuler
There were 24 instances for this benchmark, 0 of which were unsatisfiable. 8 teams submitted a solution for this benchmark. The 8 teams that did not participate were: ASPeRiX, Cmodels, Sup, Enfragmo, LP2DIFF+BCLT, LP2SAT+MINISAT, LP2DIFF+YICES, amsolver. Of the participating teams, 1 (pbmodels) was disqualified. 8 of the 24 instances were solved by all of the teams.
Number of times OPTIMUM FOUND was produced |
Team | Number |
DLV | 9 |
IDP | 6 |
bpsolver-CLP(FD) | 15 |
Result table for benchmark "GolombRuler" |
Place | Team | Score | Time (s) |
1 | bpsolver-CLP(FD) | 2032.01 | 7299 |
2 | DLV | 1568.99 | 9638 |
3 | Potassco | 1282.68 | 14400 |
4 | IDP | 1282.68 | 11023 |
5 | Claspfolio | 1031.62 | 14399 |
6 | Smodels-IE | 455.22 | 14400 |
7 | sabe | 0 | 8240 |
8 | Sup | 0 | 14520 |
9 | pbmodels | 0 | 14520 |
10 | LP2SAT+MINISAT | 0 | 14520 |
11 | LP2DIFF+YICES | 0 | 14520 |
12 | LP2DIFF+BCLT | 0 | 14520 |
13 | ASPeRiX | 0 | 14520 |
14 | amsolver | 0 | 14520 |
15 | Cmodels | 0 | 14520 |
16 | Enfragmo | 0 | 14520 |
2.4.5 Benchmark LabyrinthOptimize
There were 29 instances for this benchmark, 0 of which were unsatisfiable. 5 teams submitted a solution for this benchmark. The 11 teams that did not participate were: ASPeRiX, Cmodels, Sup, bpsolver-CLP(FD), Enfragmo, LP2DIFF+BCLT, LP2SAT+MINISAT, LP2DIFF+YICES, pbmodels, sabe, amsolver. Of the participating teams, 1 (IDP) was disqualified. 12 of the 29 instances were solved by all of the teams.
After the competition, an error was discovered in the Asparagus encoding used by Potassco, Claspfolio and Smodels-IE.
Number of times OPTIMUM FOUND was produced |
Team | Number |
Potassco | 14 |
DLV | 6 |
Smodels-IE | 8 |
Claspfolio | 10 |
Result table for benchmark "LabyrinthOptimize" |
Place | Team | Score | Time (s) |
1 | Potassco | 2337.26 | 10460 |
2 | DLV | 2081.86 | 14225 |
3 | Claspfolio | 1703.33 | 12541 |
4 | Smodels-IE | 1444.04 | 13732 |
5 | IDP | 0 | 17545 |
6 | Sup | 0 | 17545 |
7 | pbmodels | 0 | 17545 |
8 | LP2SAT+MINISAT | 0 | 17545 |
9 | LP2DIFF+YICES | 0 | 17545 |
10 | bpsolver-CLP(FD) | 0 | 17545 |
11 | LP2DIFF+BCLT | 0 | 17545 |
12 | sabe | 0 | 17545 |
13 | ASPeRiX | 0 | 17545 |
14 | amsolver | 0 | 17545 |
15 | Cmodels | 0 | 17545 |
16 | Enfragmo | 0 | 17545 |
2.4.6 Benchmark MaximalClique
There were 29 instances for this benchmark, 0 of which were unsatisfiable. 7 teams submitted a solution for this benchmark. The 9 teams that did not participate were: ASPeRiX, Cmodels, Sup, LP2DIFF+BCLT, LP2SAT+MINISAT, LP2DIFF+YICES, pbmodels, sabe, amsolver. 29 of the 29 instances were solved by all of the teams.
Number of times OPTIMUM FOUND was produced |
Team | Number |
Potassco | 15 |
DLV | 15 |
Smodels-IE | 15 |
IDP | 15 |
bpsolver-CLP(FD) | 15 |
Claspfolio | 15 |
Result table for benchmark "MaximalClique" |
Place | Team | Score | Time (s) |
1 | Potassco | 2550 | 8413 |
2 | DLV | 2491.16 | 8426 |
3 | Claspfolio | 2463.05 | 8430 |
4 | Smodels-IE | 2398.33 | 8487 |
5 | IDP | 2244.44 | 9027 |
6 | bpsolver-CLP(FD) | 1850 | 9487 |
7 | Enfragmo | 1325.11 | 1203 |
8 | Sup | 0 | 17545 |
9 | pbmodels | 0 | 17545 |
10 | LP2SAT+MINISAT | 0 | 17545 |
11 | LP2DIFF+YICES | 0 | 17545 |
12 | LP2DIFF+BCLT | 0 | 17545 |
13 | sabe | 0 | 17545 |
14 | ASPeRiX | 0 | 17545 |
15 | amsolver | 0 | 17545 |
16 | Cmodels | 0 | 17545 |
2.4.7 Benchmark SokobanOptimize
There were 28 instances for this benchmark, 3 of which were unsatisfiable. 7 teams submitted a solution for this benchmark. The 9 teams that did not participate were: ASPeRiX, Cmodels, Sup, bpsolver-CLP(FD), Enfragmo, LP2DIFF+BCLT, LP2SAT+MINISAT, LP2DIFF+YICES, amsolver. 4 of the 28 instances were solved by all of the teams.
Number of times OPTIMUM FOUND was produced |
Team | Number |
Potassco | 25 |
DLV | 24 |
Smodels-IE | 14 |
IDP | 25 |
Claspfolio | 25 |
Result table for benchmark "SokobanOptimize" |
Place | Team | Score | Time (s) |
1 | Potassco | 2800 | 66 |
2 | IDP | 2800 | 115 |
3 | Claspfolio | 2800 | 210 |
4 | DLV | 2400 | 7057 |
5 | Smodels-IE | 1600 | 9516 |
6 | sabe | 650 | 8073 |
7 | pbmodels | 300 | 15003 |
8 | Sup | 0 | 16940 |
9 | LP2SAT+MINISAT | 0 | 16940 |
10 | LP2DIFF+YICES | 0 | 16940 |
11 | bpsolver-CLP(FD) | 0 | 16940 |
12 | LP2DIFF+BCLT | 0 | 16940 |
13 | ASPeRiX | 0 | 16940 |
14 | amsolver | 0 | 16940 |
15 | Cmodels | 0 | 16940 |
16 | Enfragmo | 0 | 16940 |
2.4.8 Benchmark TravellingSalespersonOptimize
There were 29 instances for this benchmark, 0 of which were unsatisfiable. 6 teams submitted a solution for this benchmark. The 10 teams that did not participate were: ASPeRiX, Cmodels, Sup, Enfragmo, LP2DIFF+BCLT, LP2SAT+MINISAT, LP2DIFF+YICES, pbmodels, sabe, amsolver. 11 of the 29 instances were solved by all of the teams.
Result table for benchmark "TravellingSalespersonOptimize" |
Place | Team | Score | Time (s) |
1 | Potassco | 1771.8 | 17400 |
2 | Smodels-IE | 1578.84 | 17400 |
3 | IDP | 1554.3 | 17400 |
4 | Claspfolio | 1283.32 | 17400 |
5 | bpsolver-CLP(FD) | 825 | 17400 |
6 | DLV | 768.29 | 17400 |
7 | Sup | 0 | 17545 |
8 | pbmodels | 0 | 17545 |
9 | LP2SAT+MINISAT | 0 | 17545 |
10 | LP2DIFF+YICES | 0 | 17545 |
11 | LP2DIFF+BCLT | 0 | 17545 |
12 | sabe | 0 | 17545 |
13 | ASPeRiX | 0 | 17545 |
14 | amsolver | 0 | 17545 |
15 | Cmodels | 0 | 17545 |
16 | Enfragmo | 0 | 17545 |
2.4.9 Benchmark WeightBoundedDominatingSetOptimize
There were 29 instances for this benchmark, 0 of which were unsatisfiable. 6 teams submitted a solution for this benchmark. The 10 teams that did not participate were: ASPeRiX, Cmodels, Sup, Enfragmo, LP2DIFF+BCLT, LP2SAT+MINISAT, LP2DIFF+YICES, pbmodels, sabe, amsolver. 2 of the 29 instances were solved by all of the teams. 4 of the instances were solved by no team.
Result table for benchmark "WeightBoundedDominatingSetOptimize" |
Place | Team | Score | Time (s) |
1 | Potassco | 1675 | 17400 |
2 | IDP | 1113.33 | 17400 |
3 | Claspfolio | 361.66 | 17400 |
4 | bpsolver-CLP(FD) | 215 | 17400 |
5 | DLV | 150 | 17400 |
6 | Smodels-IE | 50 | 17400 |
7 | Sup | 0 | 17545 |
8 | pbmodels | 0 | 17545 |
9 | LP2SAT+MINISAT | 0 | 17545 |
10 | LP2DIFF+YICES | 0 | 17545 |
11 | LP2DIFF+BCLT | 0 | 17545 |
12 | sabe | 0 | 17545 |
13 | ASPeRiX | 0 | 17545 |
14 | amsolver | 0 | 17545 |
15 | Cmodels | 0 | 17545 |
16 | Enfragmo | 0 | 17545 |