It seems like this was a follow-up to previous brute-force efforts, which include a spreadsheet of various results, but it would help to have some conclusions about which approaches were best: http://forum.redump.org/topic/51851/dumping-dvds-raw-an-ongo...
Also, I couldn't find any source or download for DiscImageMender.
So when I used ddrescue, I read at that block size (instead of just 2048 bytes) in the hope of either getting lucky with a clean read, or getting enough signal that the ECC could repair the whole large block.
This was very effective at recovering DVDs over repeated reads. When I had previously done it with 2048-byte reads only, I would end up with good 2048-byte sectors scattered all over; if ECC is applied per block of 16 sectors (16 x 2048 = 32 KiB), that means I was leaving a lot of data on the floor that those reads should have been able to recover.
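For anyone wanting to try this, a minimal sketch with GNU ddrescue (the device and file names are placeholders): -b sets the sector size and -c the cluster size in sectors, so -b 2048 -c 16 makes each request cover a full 16 x 2048 = 32 KiB ECC block; -d reads the disc directly, bypassing the kernel cache, and -r 3 gives bad areas three retry passes.

    # read whole 32 KiB ECC blocks: 16 sectors of 2048 bytes per request
    ddrescue -d -b 2048 -c 16 -r 3 /dev/sr0 dvd.iso dvd.map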
ddrescue also helped in another way: when I was trying to recover a (video) DVD from multiple damaged copies, as long as they were not damaged in the same locations, I could fill in the blanks from one disc with reads from another (sketched below).
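A sketch of that workflow, assuming the second disc is a differently damaged copy of the same content: ddrescue records finished areas in the mapfile, so rerunning against the second copy (with a retry pass) only re-reads the regions still missing.

    # first damaged copy: rescue what we can; bad areas are recorded in the mapfile
    ddrescue -d -b 2048 -c 16 /dev/sr0 movie.iso movie.map
    # swap in the second damaged copy and rerun with a retry pass: ddrescue
    # consults the same mapfile and only re-reads the areas still missing
    ddrescue -d -b 2048 -c 16 -r 1 /dev/sr0 movie.iso movie.map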
Perhaps you can correct me about the 16-sector block mechanism; maybe it was just chance that it worked and my understanding at the time was wrong.
It's strange to see no mention of cleaning the drives themselves, although maybe it was implicit --- if you have a pile of old drives sitting around, chances are they're not going to be perfectly clean. A tiny bit of dirt on the lens can have a huge effect on the read signal, especially on a marginal disc.
Related article from 18 years ago: https://news.ycombinator.com/item?id=21242273