Systematic Sudoku applies human engineering principles to a game, Sudoku solving, with the goal of making it a more pleasurable and more widely used mentally therapeutic activity. Techniques of basic candidate identification and advanced value placement were described in detail in the first year of the Sysudoku blog, and are now collected in an accessible set of pages, the Guide. There is a systematic process order, the Sysudoku Order of Battle (SORB), designed to concentrate human attention in a generally efficient way.
To further this goal, Sysudoku reviews both the writings of Sudoku authors and collections of Sudoku puzzles as they become available. The expert reviews attempt to evaluate claims and theories about human Sudoku solving from a human engineering perspective. In Sudoku, as in everyday life, this involves judgment as to what is practical for human solving, as opposed to measures only practical in computer-coded solving algorithms.
The objective in collection reviews is to rate collections in order of difficulty, and to provide realistic examples showing readers what is to be gained by playing the Sudoku game efficiently and at more advanced levels.
Ratings by Sudoku publishers are subjective and unreliable. Terms like Very Hard, Extreme, Fiendish, Diabolical, or Nasty have become meaningless. The Sysudoku Order of Battle does provide a fair way out: report a Sysudoku solution path in a way that can be accurately followed. The collection’s difficulty can then be judged by the reader by how far a representative set of puzzles gets before collapsing.
That sounds good, but in order for readers at every level to make comparisons, the reports must be accessible at a glance. That is the reason for the Review Table.
The drag off page is a list of collection review tables for all the puzzle reviews. How far you read from left to right depends on the level at which you solve. Here is a current table:
The left, highlighted column identifies the puzzle number or difficulty rating within the collection.
Under Gv is the number of givens, of interest to some readers, but not in itself a measure of difficulty.
BY is of primary interest to beginning sysudokies: the number of cells reserved in the bypass for specified values, i.e. for subsets, including clues. In any column, the code – – – signals a collapse. Two review puzzles in the sample table above collapse in the bypass.
Box is the number of slinks and aligned triples, then after the slash, the number of cells reserved in box marking.
Line is a tally of the line marking. It’s a set of lists, separated by commas. Each list is headed by the number of free (unreserved) cells on the line, followed by a colon. Lines are marked in increasing order of free cells, taking easier markings first. Changes in a marked line are followed up immediately, and can change the order of marking of remaining lines. List heading numbers can therefore decrease, but generally increase.
Significant events are signaled by short event codes. For example, in the first puzzle, a rare hidden triple occurs among the lines of 5 free cells. In the 5: list, the code ht counts as one line. Three such lines are marked before the hidden triple and two after. Six consecutive lines of 5 free cells were marked.
When a line marking results in slinks or aligned triples and clues, this result is recorded in parentheses as (slinks, cells reserved) or (cells reserved), without an event code or immediately after one. If the hidden triple above had resulted in three slinks and two clues, the 5 free cell list would have been written 5:3ht(3,2)2. If instead it had produced 4 clues and no slinks or aligned triples, the list would read 5:3ht(4)2. If it was 2 slinks and no clues, the list would be 5:3ht(2,0)2.
Event codes used include:
bxl – box/line
np, nt, nq, hp, ht, hq – (naked, hidden) (pair, triple, quad)
hdx – hidden dublex
xw – X-wing
ER – empty rectangle, supplying an X-chain weak link
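As a concrete illustration of the Line column notation, here is a minimal parsing sketch in Python. It is my own illustration, not part of any Sysudoku tooling, and the function and token names are hypothetical.

```python
import re

# Hypothetical illustration: tokenize one line-marking list such as "5:3ht(3,2)2"
# into its parts: the free-cell heading, runs of plainly marked lines, event codes,
# and any parenthesized (slinks, cells reserved) or (cells reserved) results.
TOKEN = re.compile(r"\((\d+)(?:,(\d+))?\)|([a-zA-Z]+)|(\d+)")

def parse_line_list(entry):
    head, _, body = entry.partition(":")
    free_cells = int(head)                 # every line in this list had this many free cells
    items = []
    for slinks, reserved, code, count in TOKEN.findall(body):
        if code:                           # an event code such as ht, np, xw
            items.append(("event", code))
        elif count:                        # a run of lines marked without events
            items.append(("lines", int(count)))
        elif reserved:                     # (slinks, cells reserved)
            items.append(("result", int(slinks), int(reserved)))
        else:                              # (cells reserved) alone
            items.append(("result", int(slinks)))
    return free_cells, items

# The hidden triple example from the text:
print(parse_line_list("5:3ht(3,2)2"))
# -> (5, [('lines', 3), ('event', 'ht'), ('result', 3, 2), ('lines', 2)])
```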
Under the heading BV is the number of bi-value cells (bv) left by line marking. Early advanced methods depend on the bv field.
The rightmost column contains brief descriptions of advanced and extreme methods employed for the solution, and any trial results. The order of these methods changes in response to results, but generally follows the Order of Battle stages:
Bv scan: unique rectangle (UR, hidden UR); Sue de Coq; XY-chains; XYZ-wings; APE; BARN; XZ_ALS, ALS toxic sets
X-panels: X-chains, grouped X-chains, regular, finned, kraken fish;
Medusa coloring and bridging as available.
Small ALS grid scan: XZ pair match, ALS APE.
X-panels: freeform pattern analysis.
AIC building: AIC ANL, boomerangs, 1-way, complex 1-way.
Lite coloring: Completing extensive but stalled clusters.
Suset ALS enumeration: all ALS in three tables
ALS mapping: ALS node AIC, ALS_XZ, ALS-wings
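Putting the columns together, a review table row could be summarized by a small record like the hypothetical sketch below; the field names are mine, chosen only to restate the legend above.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ReviewRow:
    puzzle: str                        # left column: puzzle number or publisher rating
    givens: int                        # Gv: number of givens
    bypass: Optional[int]              # BY: cells reserved in the bypass, or None after a collapse
    box: Optional[str]                 # Box: "slinks and aligned triples / cells reserved"
    line: Optional[str]                # Line: comma-separated lists, e.g. "4:2, 5:3ht(3,2)2"
    bv: Optional[int]                  # BV: bi-value cells left by line marking
    advanced: str = ""                 # rightmost column: advanced and extreme methods, trials
    collapsed_in: Optional[str] = None # stage showing the – – – collapse code, if any
```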
Dear Sir.
Congratulations on your dedication, your systematic approach to problem solving, and your ability to postulate methodologies.
I have a question for you. I am starting a new project to launch a high-level sudoku competition, online, with a great face-to-face (in person; sorry for my English) final. However, I know that there exist several sudoku solving engines that could convert the tournament into a “cheats cage”. I have some solutions in mind, but none seems good enough in the long term (irregular sudoku, easy sudokus instead of difficult ones, avoiding sudoku altogether (I don’t like that), asking the winner to demonstrate his real abilities before rewarding him, etc.). I have just one that could help: obtaining average scores in different games, not only “straight sudoku”, so the cheater could not trick every kind of game in the tournament.
I am asking you if, with your unusual analysis capabilities, you could find another practical solution; and, more important, whether you foresee any other problem or difficulty when launching the tournament.
Finally, if you are an MS-Excel fan (like me), I invite you to visit my page (quickrows.com), in which I try some different ways to use Excel.
Thanks in advance and best regards
José Corona
Spain
It’s a great idea. I have a brief report of the Akron Sudoku Tournament, held for the first time in November of last year. I was a participant, but had no chance to join the four finalists who worked their puzzles on large white boards on stage. The puzzles were generally too hard to finish, but scoring was on the number of correct clues. There were three rounds of 20 minutes, with three puzzles handed out per round, then a final puzzle for the four finalists.
That all worked out well for a live contest, but three puzzles is too many for 20 minutes. You should not expect a large number of experts, and I think easy puzzles are best for live contests. The Akron-Summit Library got permission to use Will Shortz puzzles, without individual source attribution.
For an online contest, there are many problems. The only way to rule out computer-based solutions is to ban arbitrary trial and error, and require the logic of each removal and promotion to be spelled out by the solver. Under these rules, you could even allow the contestant’s favorite solver, if it was clearly identified, so that others could acquire it. This would be a time-consuming nightmare to adjudicate, but it would promote human solving and an appreciation of Sudoku logic.
You are right. It is complicated to avoid automatic solvers in an online tournament. I have an idea that could work. I will try it and let you know if it works.
Thanks for your kind response.
Best regards
José
I’d love to see you review the book “Sudoku ultrahardcore 1”:
http://www.ps-heine.de/archives/1322
(Here’s a free excerpt: http://www.ps-heine.de//bibliothek/tipps/Band39_Web.pdf )
It’s from the well-known German puzzle creator Stefan Heine, who also creates Sudokus for the World Championships. The description says that the Sudokus on the right side of the pages (in this case Sudoku numbers 3, 4, 7, 8) are so difficult that “intuition beats logic” (whilst still having only 1 valid solution, of course). My skill level isn’t high enough to crack them, but I love reading your analysis of such hard puzzles.
Thanks for your blog and keep up the good work!
And I’d love to do it. I wouldn’t publish results of the 3, 4, 7, 8 sample without Stefan’s permission, but maybe you could inform us all how to go about getting a copy of Sudoku ultrahardcore 1. Then I could do my usual pre-selection of 10 covering the book or the hardest section. It needs to be practical for Sysudoku readers.
That sounds great!
I think these would be the best places to order the book in the US:
https://www.bookdepository.com/SUDOKU-ultrahardcore-1-Stefan-Heine/9783939940388
https://www.jpc.de/jpcng/books/detail/-/art/stefan-heine-sudoku-ultrahardcore-1/hnum/6245609
https://www.abebooks.com/9783939940388/SUDOKU-ultrahardcore-1-3939940380/plp