Link to paper
For nearly 50 years theorists believed that if solving a problem takes t steps, it should also need roughly t bits of memory: 100 steps, on the order of 100 bits. To be exact, the best known bound was t/log(t).
Ryan Williams showed that any problem solvable in time t needs only about sqrt(t) bits of memory: a 100-step computation could be compressed and solved with something on the order of 10 bits.
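Roughly, in complexity-theory notation (my paraphrase of the two bounds, not a quote from the paper):

    % Hopcroft-Paul-Valiant (1975): time t can be simulated in space t / log t
    \mathrm{TIME}[t] \subseteq \mathrm{SPACE}\big[\, t / \log t \,\big]

    % Williams (2025): time t can be simulated in space on the order of sqrt(t log t)
    \mathrm{TIME}[t] \subseteq \mathrm{SPACE}\big[\, O(\sqrt{t \log t}) \,\big]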
log to what base? 2 or e or 10 or...
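For asymptotic statements like these the base doesn't matter: switching bases only multiplies by a constant, which the big-O absorbs.

    \log_a t \;=\; \frac{\log_b t}{\log_b a} \;=\; \Theta(\log_b t)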
Why do programmers have to be so sloppy?
Highly recommend
So it's more like a North Star. Maybe not directly practical, but it will lead tons of people in the right direction.
As in, algorithms you'll care about if you're a programmer and not overly interested in theory? Not really, no.
The analogy is thermodynamics: it tells you how efficient a heat engine _could_ be. If your engine is way below that bound, you know how much improvement is _possible_, and that it's not an impossible problem. That will get people to build better stuff.
Memoization is likely the best you can do.
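As a minimal sketch of that trade-off (my example, not from the article), memoization spends memory on a cache to avoid recomputation:

    from functools import lru_cache

    @lru_cache(maxsize=None)  # unbounded cache: O(n) extra memory
    def fib(n: int) -> int:
        # Naive recursion takes exponential time; with the cache each
        # value is computed once, so the whole call runs in linear time.
        return n if n < 2 else fib(n - 1) + fib(n - 2)

    print(fib(100))  # 354224848179261915075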
For algorithms, a little memory outweighs a lot of time. 343 points, 139 comments, 39 days ago. https://news.ycombinator.com/item?id=44055347
You need much less memory than time. 126 points, 11 comments, 22 days ago. https://news.ycombinator.com/item?id=44212855