
“Dynamic Programming” is not referring to “computer programming”

https://www.vidarholen.net/contents/blog/?p=1172
246•r4um•3d ago

Comments

smokel•6h ago
Also, the term "dynamic programming" has slightly different meanings in the context of leetcode problems and in the context of reinforcement learning.

In the former, it's for optimizing a relatively small deterministic system; in the latter, a stochastic process.

Pretty confusing.

jeltz•29m ago
The idea is way older than both leetcode and reinforcement learning, and is used everywhere (for example when planning SQL queries). If reinforcement learning invented a new way to use the word, that's on them; the leetcode usage is true to the original meaning.
mrbluecoat•6h ago
Reminds me of "Extreme Programming" [0], which refers to project methodology rather than computer programming.

[0] https://en.m.wikipedia.org/wiki/Extreme_programming

rzzzt•5h ago
Compare with "Extreme Ironing": https://www.ebaumsworld.com/pictures/23-insane-examples-of-e...
userbinator•5h ago
If it weren't for the fact that the page is 10 years old, I would've thought those were AI-generated images at first glance.
stephenlf•6h ago
I love that origin story.
gsf_emergency_2•6h ago
It seems dynamic programming (some predetermined sequence of steps) as envisioned by Bellman is strictly less dynamic than ordinary programming today, hence the confusion?

Elevating "memoization+recursion" to "dynamic programming" seems like a development that Bellman didn't anticipate

https://cs.stackexchange.com/questions/99513/dynamic-program...

There is another mid-century term that is oddly specific for something oddly general (or is it the other way around?): "Operations Research"

Guess Bellman should have called it "Reactive Optimization", then we would have the totally appropriate "Recursive Memoization" or "Memoized Recursion"

LPisGood•6h ago
You should think of the “programming” in dynamic programming the same way you think of it in linear programming, integer programming, and constraint programming.

Indeed even in layman’s terms, thinking of it as in television programming is more accurate than thinking it is related to computer programming (as is mentioned in TFA)

TeMPOraL•5h ago
> linear programming, integer programming, and constraint programming

Can't think of a universe in which you'd learn about these things before learning "computer programming".

tgma•5h ago
There was a universe before digital computers were popular or a thing at all.

Computing predates computers.

Thorrez•3h ago
Before digital computers existed, there were still computers. They were people. A baker is a person who bakes. A computer was a person who computes (computing mathematical results).
opnac•4h ago
In the UK at A-level (age 16-18) you may still be taught linear and dynamic programming before ever touching a line of code! (Indeed, that was the same for me!)
mr_toad•1h ago
I know two people who are experts in linear programming who have never written a single line of code.
agumonkey•5h ago
I always find funny how many meanings "programming" has.
thaumasiotes•4h ago
Television programming isn't a separate meaning from computer programming. Programming means creating a program. The program, whether you're talking about a computer, a television channel, or a live staged performance, is a list of what's going to happen in what order.
mort96•3h ago
I think of "programming" in "dynamic programming" the exact same way I think of it in "linear programming", "integer programming" and "constraint programming": it's probably some kind of software development that some computer scientists came up once and that I don't need to think about, because my normal programming has worked out pretty well so far

(Except, well, I guess I understand what "dynamic programming" is more than I understand what the other forms of programming you mention are; "dynamic programming" is solving certain classes of recursive problems by using arrays, sometimes nested arrays, to avoid re-computation, and somehow that's supposed to be more "dynamic" than not doing that)

Certhas•5h ago
The sequence of steps is the result of dynamic programming. Not every sequence of steps is a dynamic programme. And (what I would argue are) the core results are fairly mathematical, memoization and recursion don't feature, but partial differential equations do. Check out the Hamilton Jacobi Bellman equation to get a flavour.
thaumasiotes•4h ago
> Elevating "memoization+recursion" to "dynamic programming" seems like a development that Bellman didn't anticipate

Well, it also isn't a development that happened. You can (always) implement dynamic programming as recursion with memoization. But an algorithm that is recursive and memoized isn't necessarily an example of dynamic programming.

The point of dynamic programming is that you set up the recursion so that recursive calls do hit the cache, not that there is a cache and if you're lucky they might hit it.

zelphirkalt•3h ago
Usually though when using recursion to solve an algorithmic problem, people memoize results they know will be needed later again. So in most cases that should automatically become the same thing in practice.
npsomaratna•6h ago
Sri Lankan here. I used to compete in the IOI (International Olympiad in Informatics), back in the '90s.

I aced the competition in Sri Lanka, but I failed to win a medal the first time I competed internationally. I solved several problems with recursion, but these failed to complete within the strict time limits allowed.

One of the contestants from another country told me: "You needed dynamic programming to solve problems 1 and 5." I then spent the next year trying to figure out what dynamic programming was. The folks over here weren't familiar with the term. In fact, I was often asked "did you mean dynamic memory allocation?"

After a while, I managed to find a book that talked about dynamic programming, and I was like "Oh, it's recursion + storing results" and "Ah, you can also do this iteratively in certain circumstances"

Bottom line, armed with this new knowledge, I ended up winning a gold medal at IOI 2001. Fun times.
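That "recursion + storing results" summary, and its iterative twin, can be sketched in a few lines of Python (an illustrative example, not any contest solution):

```python
from functools import lru_cache

# Top-down: plain recursion plus a cache of results ("memoization").
@lru_cache(maxsize=None)
def fib_memo(n: int) -> int:
    if n < 2:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)

# Bottom-up: the same recurrence filled in iteratively, no recursion at all.
def fib_iter(n: int) -> int:
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a
```

Without the cache, the recursive version is exponential; with it, both versions are linear in n.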

zekica•6h ago
I had the same experience: none of my mentors knew what "dynamic programming" was, and our country-level competition (which had problems inspired by the IOI) required dynamic programming for 2 out of 5 problems. And of course I failed the first time (2004). Then I learned what it was about and aced it the next time (2005).
snthpy•5h ago
I was part of the hosting team at IOI 1997 in Cape Town, had similar conversations with the participants to the ones you describe, and also learned about Dynamic Programming there. Your description of it as "recursion + storing results, which you can sometimes do iteratively" is the best summary. A good example is the iterative Fibonacci algorithm.

Another fun anecdote from that time is that we had to take special precautions with the scoring program which connected to the participants machine over the serial port. The previous year one participant had hacked the serial port interface to manipulate the results. :-)

npsomaratna•3h ago
Nice! I guess you must know Bruce Merry. Once, when practicing, I kept on hitting my head on a problem. On the spur of the moment, I decided to fire off an email to Bruce (given his track record, I was like "if anyone can figure this out, he can"). I was shocked (and very pleased) to get a reply a few days later outlining his thought process and how he went around solving the problem.

I used to know the ZA team leaders as well (after competing, I was a team leader for several more years).

krackers•5h ago
Has there been an "arms race" of sorts with programming contests, as knowledge percolates and harder categories of problems need to be chosen? Since these days dynamic programming is basically table stakes for ICPC-type problems.
paldepind2•3h ago
Yes, absolutely. I did programming competitions back in high-school (around 10 years ago) and common folklore was that back in the days knowing dynamic programming could win you a medal, but today it was just basic expected knowledge.
bubblethink•3h ago
There's a lot of variety in DP. Knowing about DP doesn't help much with solving DP problems. I'm sure you've seen all the fun problems on Codeforces or leetcode hards. The one quote I remember from Erik Demaine's lecture on DP is where he comments, "How do we solve this with DP? With difficulty."
npsomaratna•3h ago
That's my impression as well.

I think it's because access to knowledge has become a lot easier, so contestants know more; plus, the competitions themselves have become more organized.

For example, last I checked, the IOI had a syllabus outlining the various algorithms contestants needed to know. During my time, it was more of a wild west.

kindkang2024•2h ago
That's how Competition Makes All Great Again. Looks like it's by God's design. ^_^
kindkang2024•4h ago
Loved your wonderful story—here’s my plain one.

I came to programming after I graduated, and I never really understood Dynamic Programming until I watched one of Stanford’s algorithm courses (kudos to MOOCs).

But honestly, I’ve never done anything truly useful with it—except that it gave me a new perspective on ordinary things. For example, I found that there may be great wisdom in the Taiji symbol (image here: https://commons.wikimedia.org/wiki/File:Yin_yang.svg). We can divide the current problem and conquer it at the lower levels (like one of the dots in yin and yang). And of course, those lower levels still contain their own divisions and computations—until there are none left.

At the same time, we can also start from the base and build upward (bottom-up), moving toward higher-level computations—where possible divisions await, until eventually one big unified answer emerges.

Pretty useless for its actual usage here, isn’t it? ^^

matsemann•4h ago
Dynamic programming is for me one of the things that really scratched an itch and made me start liking programming. The invariants and induction of composing smaller things into a bigger solution are so elegant.

I do feel dynamic programming influences how I solve problems in my work: how, if you compose the problem correctly, it can never reach a fault state, in a sense. Not sure if I'm able to explain what I mean, but for instance I recently refactored an old application at work that had lots of bugs, was hard to grasp, and had loads of error-checking code/asserts. By composing it a bit differently, modeling it properly, I could remove swaths of code/defensive programming, because with the help of the compiler I can now guarantee the base cases hold etc.

Edit: a somewhat contrived example might be an Order. If you have one big model tracking the lifetime of an order, some fields will have to be null until they have a value, for instance a field sent_at_time. Then you have a function that generates an email to the customer, which uses that field. How do you handle that field possibly being null? Do you say "I know that at this point in time, this field is always set, since the email is generated after sending" and tell the compiler to ignore the possibly-null value? Or do you handle the possible null with some error handling / fallback value that in practice is dead, cluttered code and will just confuse a future developer with questions like "how can this happen?"

With the lessons from dynamic programming in mind, I think both are bad solutions. You want to be able to go from state n to n+1 with safety (order purchased, ordered sent, email sent). So model it in a way that the email sending code only ever can get an order in the correct previous state as input. Then if the assumption were to break in the future (maybe some orders are digital and the email is sent without the order having been physically sent), that breaking assumption will immediately be obvious in how the state/models are composed, and not in runtime when either the not-null-assert fails or the emails start having the fallback value.

lock1•3h ago
Maybe "Making illegal state unrepresentable" is the term you're looking for?

https://fsharpforfunandprofit.com/posts/designing-with-types... https://inside.java/2024/06/03/dop-v1-1-illegal-states/

IMO it doesn't have much relation to dynamic programming, though.

matsemann•2h ago
It was mainly just one example of the thought pattern that I feel translates. In my head it has a relation. In dynamic programming, you compose a solution of smaller, but still valid solutions. When calculating fib(8), you can trust that combining fib(7) and fib(6) is valid, and so on down to the base case. If you apply that concept to other parts of programming, it's much easier to reason about.
nicknash•3h ago
I wonder if you can clear up a memory of mine from IOI 2001. The first day results were delayed, and I seem to remember this is because a contestant encoded the search in one of the problems into a compile time c++ template metaprogram. On the second day, there was then also a compile time limit for solutions, from what I remember. Do you remember any more details of this?
xandrius•3h ago
That sounds very smart :D
amelius•3h ago
But aren't the programs given access to the problem data _after_ the program has been compiled?
colechristensen•2h ago
There are plenty of opportunities to precompute _everything_ for a certain class of problems.
fer•2h ago
Sure, but the input might be bounded/finite, or the operations needed similarly constrained (e.g. trigonometry operations). Then you can offload lots of the computation to the compilation, sometimes all of it.
SkiFire13•1h ago
Yeah, but they didn't precompute the solution for that specific program data, they precomputed the solution for all possible ones, and then selected the correct one when the program data was provided.
npsomaratna•3h ago
No, sorry. I vaguely remember compile time limits, but they were high enough (30 seconds, I think?) that I didn't bother worrying about them (at least, that's my memory).
hnfong•2h ago
I think I know the culprit. The story I was told is that the person precomputed some of the results and submitted a huge source file (in tens of megabytes) of hard coded data. This probably led to subsequent amendment of the rules regarding source file sizes.

I'll report back once I find the persons involved.

npsomaratna•53m ago
Wasn't this for one of the questions on the second day? Where you had to submit a series of outputs, but these were trivial to generate once you cracked the problem.

I remember getting an "aha" moment, writing a program, and then submitting (scored 100% too!). Then, I met a guy who also cracked the problem and realized that he just needed to paste the test data into a spreadsheet, do some basic sorting there, and then paste the output into a text file; no coding necessary.

I felt pretty stupid afterwards.

mort96•3h ago
One thing I always wondered is: what makes dynamic programming more "dynamic" than regular programming? Why is recursion "static" but it becomes "dynamic" if you store results?
bazoom42•3h ago
According to the article, the term “dynamic” was chosen because it has positive connotations.
moffkalast•2h ago
Because fuck descriptive names right?
jagged-chisel•2h ago
Two hard problems in computer science…
fragmede•2h ago
1. Cache invalidation

2. Naming things

3. Off-by-one errors

lvncelot•1h ago
0. Race conditions
_0ffh•2h ago
If you have to impress the bureaucrats to get funding, you go and impress the bureaucrats.
DonHopkins•2h ago
Oh, then you will really love and want to fuck the name of the "Wave Function Collapse" algorithm! (Maybe Chuck Tingle will write a book about it.) Also "Extreme Learning Machines", "Reservoir Computing", "Liquid State Machines", and "Crab Computing" (I shit you not)! Don't even get me started on "Moveable Feast Machines".

[Welcome to Coffee Talk, with Linda Richman. Today's topic: Wave Function Collapse is neither quantum physics, particles, waves, nor collapsing functions. Discuss amongst yourselves, while I am constrained to solve all these Sudoku puzzles. Oh, I am so verklempt!]

https://news.ycombinator.com/item?id=42749675

DonHopkins 6 months ago | parent | context | favorite | on: Generating an infinite world with the Wave Functio...

Ha ha, great comment! You're right, WFC's name just refers to quantum mechanics because it sounds cool, not because the algorithm has anything to do with physics, waves, or collapsing functions (it's more just a constraint-based Sudoku solver). But I perceive it more as a playful buzzword for a useful algorithm, and not as maliciously deceptive like Deepak Chopra, who I was just commenting on recently (Quantum was cool until Deepak Chopra ruined it for everybody):

https://news.ycombinator.com/item?id=42714477

I do think it's useful combined with other techniques like procedural programming, cellular automata, Voronoi diagrams / jump flooding, noise, etc, to steer the higher and lower level structures. Check out Maxim Gumin's work on Markov Junior!

https://twitter.com/ExUtumno/status/1531992699997507586

Maxim Gumin @ExUtumno: I published a new project about combining rewrite rules and solving constraint problems made of rewrite rules

https://github.com/mxgmn/MarkovJunior

I think the really nice property of WFC is that it's very naturally and easily artist-driven. It doesn't require programming skills to use, you just supply it with a bunch of concrete before and after examples. (Like "Programming by Example".) That makes it much easier for a wide range of people to use, which is an important feature for democratizing custom world generation. Especially when you can construct or discover the examples in-world, and have it operate kind of like a 2-d tile or 3-d cube auto-complete, that looks at the rest of the stuff you've built or seen, like Canvas does with 1-d code.

https://news.ycombinator.com/item?id=42751176

DonHopkins 6 months ago | parent | context | favorite | on: Generating an infinite world with the Wave Functio...

The difference between Deepak Chopra's abuse of Quantum Physics terminology and WFC's is that WFC actually works and is useful for something, and its coiner publishes his results for free as open source software and papers, so he deserves more poetic license than a pretentious new-age shill hawking books and promises of immortality for cash like Deepak.

Here are some notes I wrote and links I found when researching WFC (which is admittedly a catchier name than "Variable State Independent Decaying Sum (VSIDS) branching heuristics in conflict-driven clause-learning (CDCL) Boolean satisfiability (SAT) solvers"):

https://donhopkins.com/home/wfc-notes.txt

    Here are some notes I wrote and links I found when researching Wave
    Function Collapse (WFC). -Don Hopkins

    Wave Function Collapse

    Maxim Gumin

    Paul Merrell

    https://paulmerrell.org/research/

    https://paulmerrell.org/model-synthesis/

    Liang et al

    Jia Hui Liang, Vijay Ganesh, Ed Zulkoski, Atulan Zaman, and
    Krzysztof Czarnecki. 2015. Understanding VSIDS branching heuristics
    in conflict-driven clauselearning SAT solvers. In Haifa Verification
    Conference. Springer, 225–241.

    WaveFunctionCollapse is constraint solving in the wild

    https://escholarship.org/content/qt1f29235t/qt1f29235t.pdf?t=qwp94i

    Constraint Satisfaction Problem (CSP)
    Machine Learning (ML)
[...lots more stuff...]

Even more fun stuff about WFC, AgentSheets, and defining cellular automata rules by example:

https://news.ycombinator.com/item?id=42749578

And now about "Extreme Learning Machines" and "Liquid State Machines":

https://news.ycombinator.com/item?id=42750857

DonHopkins 6 months ago | parent | context | favorite | on: Generating an infinite world with the Wave Functio...

Like calling random vector functional link networks and single-layer feed-forward networks with random hidden weight an "Extreme Learning Machine".

https://en.wikipedia.org/wiki/Extreme_learning_machine#Contr...

>Controversy

>There are two main complaints from academic community concerning this work, the first one is about "reinventing and ignoring previous ideas", the second one is about "improper naming and popularizing", as shown in some debates in 2008 and 2015.[33] In particular, it was pointed out in a letter[34] to the editor of IEEE Transactions on Neural Networks that the idea of using a hidden layer connected to the inputs by random untrained weights was already suggested in the original papers on RBF networks in the late 1980s; Guang-Bin Huang replied by pointing out subtle differences.[35] In a 2015 paper,[1] Huang responded to complaints about his invention of the name ELM for already-existing methods, complaining of "very negative and unhelpful comments on ELM in neither academic nor professional manner due to various reasons and intentions" and an "irresponsible anonymous attack which intends to destroy harmony research environment", arguing that his work "provides a unifying learning platform" for various types of neural nets,[1] including hierarchical structured ELM.[28] In 2015, Huang also gave a formal rebuttal to what he considered as "malign and attack."[36] Recent research replaces the random weights with constrained random weights.[6][37]

But at least it's easier to say, rolls off the tongue smoothly, and makes better click bait for awesome blog postings!

I also love how the cool buzzwords "Reservoir Computing" and "Liquid State Machines" sounds like such deep stuff.

https://news.ycombinator.com/item?id=40903302

>"I'll tell you why it's not a scam, in my opinion: Tide goes in, tide goes out, never a miscommunication." -Bill O'Reilly

How about rebranding WFC as "Extreme Liquid Quantum Sudoku Machines"? ;)

Then there's "Crab Computing"!

https://news.ycombinator.com/item?id=42701560

[...] If billiard balls aren't creepy enough for you, live soldier crabs of the species Mictyris guinotae can be used in place of the billiard balls.

https://web.archive.org/web/20160303091712/https://www.newsc...

https://www.wired.com/2012/04/soldier-crabs/

http://www.complex-systems.com/abstracts/v20_i02_a02.html

Robust Soldier Crab Ball Gate

Yukio-Pegio Gunji, Yuta Nishiyama. Department of Earth and Planetary Sciences, Kobe University, Kobe 657-8501, Japan.

Andrew Adamatzky. Unconventional Computing Centre. University of the West of England, Bristol, United Kingdom.

Abstract

Soldier crabs Mictyris guinotae exhibit pronounced swarming behavior. Swarms of the crabs are tolerant of perturbations. In computer models and laboratory experiments we demonstrate that swarms of soldier crabs can implement logical gates when placed in a geometrically constrained environment.

https://www.futilitycloset.com/2017/02/26/crab-computing/

And of course "Moveable Feast Machines":

https://news.ycombinator.com/item?id=15560845

https://news.ycombinator.com/item?id=24157104

https://news.ycombinator.com/item?id=34561910

The most amazing mind blowing MFM demo is Robust-first Computing: Distributed City Generation:

https://www.youtube.com/watch?v=XkSXERxucPc

ineedasername•1h ago
Not quite sure what this all is, but it is an interesting way to wind up in the a.m. fully reminded that I am not and have never been, nor will be, as smart as I would like to be, but it’s an enjoyable ride. Thanks
bonoboTP•1h ago
See https://github.com/fcampelo/EC-Bestiary
marginalia_nu•1h ago
Marketing matters almost too much in programming.

You'd hardly see people rallying behind concepts like pure functions or clean code if they were called brown functions or moist code.

nemomarx•1h ago
Hydrated pages seems to work though, so you could maybe swing wet or dry code?
DonHopkins•20m ago
As I always say, some people think DRY stands for Do Repeat Yourself. You can say that again!
hnfong•2h ago
When I learned dynamic programming, I was told it was in contrast to simple memoization: memoization is a cache of computation, while dynamic programming involves a "choice" at each step.

For example let's look at the fibonacci sequence, F(n) = F(n-1) + F(n-2), naively you just fill in a (1 dimensional) table with the results [1, 1, 2, 3, 5, 8, 13...] and it feels static because there's nothing to choose or decide.

In contrast for example the longest common subsequence looks like this:

  lcs[m, n] = 0                              if m = 0 or n = 0
  lcs[m, n] = lcs[m-1, n-1] + 1              if X[m] = Y[n]
  lcs[m, n] = max(lcs[m-1, n], lcs[m, n-1])  if X[m] != Y[n]
At every step there's a "choice" to be made depending on the situation. The functions like "max()" are quite typical for dynamic programming. At least this is how I interpret the "dynamic" part.
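That recurrence translates almost mechanically into a bottom-up table fill; a minimal Python sketch (illustrative only):

```python
def lcs_length(x: str, y: str) -> int:
    m, n = len(x), len(y)
    # (m+1) x (n+1) table; row 0 and column 0 encode the base case lcs = 0.
    table = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if x[i - 1] == y[j - 1]:
                table[i][j] = table[i - 1][j - 1] + 1
            else:
                # The "choice": take the better of the two smaller subproblems.
                table[i][j] = max(table[i - 1][j], table[i][j - 1])
    return table[m][n]
```

Each cell depends only on cells above/left of it, so the recursion is guaranteed to "hit the cache" as the table is filled in order.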
ncruces•3h ago
Oh that's the year I went. Trip down memory lane. I was woefully unprepared. The mobile phone cell summation problem underlined how much I didn't know, and later figuring it out, how much I had to learn; it cemented my love for the field. I just loved the experience.
mrits•1h ago
As a US born native English speaker, I always struggled with the phrase. Eventually I just mentally called it a misnomer as the phrase provided no value to the actual problem set they described.
Sesse__•1h ago
For my first programming competition (the national ICPC prequalifier, in 2002), I asked my teammates (who I didn't know very well, we had just cobbled together a team) a couple of minutes before the contest what this dynamic programming thing was. One of them was “oh, it's only caching the output of a recursive function” (technically, that's memoization, but who cares).

I ended up solving a problem using DP a couple of minutes before the deadline, which was enough to get us to 3rd and to ICPC. Fun times.

ygritte•6h ago
> Try thinking of some combination that will possibly give [the word, dynamic] a pejorative meaning. It’s impossible*.

> *This Haskell fan’s contender is “dynamic typing” :P

Nothing is impossible!

zoky•5h ago
It’s certainly possible to use it as a euphemism, if not an outright pejorative. “Dynamic truthfulness”, “dynamic sense of morality”, etc.
eru•5h ago
There's also dynamic scoping; as opposed to lexical scoping a.k.a. static scoping.

You can find defenders of dynamic typing, but dynamic scope is now widely seen as a mistake. Or at least dynamic scope by default; it has specialised uses--and in Haskell the Reader Monad is basically isomorphic to dynamic scoping and no one complains about it.

ygritte•5h ago
> dynamic scoping

Right you are. Even more horrible. The Tcl language still has it!

eru•5h ago
I think EmacsLisp also does dynamic scoping. Or at least they used to about ten years ago. Not sure, if they fixed it?
jrapdx3•4h ago
Well, Tcl is kind of a mixture of scope rules. In some respects, e.g., within a proc, it's mainly a lexical environment, but of course dynamic scoping is introduced by commands like upvar/uplevel. FWIW Tcl programmers don't concern themselves very much with sorting out dynamic vs. lexical. In any case, Tcl programmers are careful to maximize static scoping. No doubt that's necessary to create reliable larger programs many of which have been written in Tcl.
geocar•4h ago
Dynamic scope/binding is super useful, but I do not agree tcl does it or does it well, because it's not just shadowing the closure environment (ha), or changing globals in a dynamic-wind.

Perl's the only non-lisp I'm aware of that does dynamic binding well-enough to get the idea, but too many things just don't work:

    local *int = sub { return 69 }; print int("42");
is a particularly annoying one: You just can't do it. Other names might have better chances, e.g.

    package foo; sub int { 42 };
    package main; sub a{local *foo::int = sub { return 69 };&b(@_);} sub b{goto \&foo::int};
    print b("x"), a("x"), b("x"), a("x");
but this is a little unsatisfying.
eru•3h ago
> Perl's the only non-lisp I'm aware of that does dynamic binding well-enough to get the idea, [...]

Well, Haskell does dynamic binding in the form of the Reader Monad. It's well received there.

geocar•2h ago
Erm no. The point is I want to have the ability to make any variable dynamic, not just work with a global variable that is really a getter.
eru•1h ago
The reader monad doesn't need to be global.

But yeah, by that standard, Haskell implements both dynamic scoping and mutable state via getters and setters. (Though, of course, you can go all out on these via lenses.)

umanwizard•1h ago
So does emacs lisp, by default, although you can set it to lexical scope per-file (and it’s recommended to always do so).
rat87•4h ago
Isn't dynamic scoping a bit similar to Dependency Injection? Maybe there's a form of it that could be useful. Like explicit dynamic scopes?
eru•3h ago
Well, isn't dependency injection just a more cumbersome way to say 'function arguments'? Dynamic scoping is exactly the same, it's basically equivalent to extra implicit function arguments that are passed through everywhere.

And yes, dynamic scopes can be useful in specific cases. What's wrong is having your variables scoped dynamically by default.
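The "extra implicit function arguments" view can be sketched in Python (a toy illustration, not any real language's scoping machinery; the names `binding` and `report` are made up):

```python
import contextlib

_dyn = {}  # name -> stack of active bindings, innermost last

@contextlib.contextmanager
def binding(name, value):
    # Push a dynamic binding for the duration of the with-block.
    _dyn.setdefault(name, []).append(value)
    try:
        yield
    finally:
        _dyn[name].pop()

def lookup(name):
    return _dyn[name][-1]  # the innermost active binding wins

def report():
    # No explicit argument: "indent" is found dynamically up the call chain,
    # as if every caller had threaded it through as a hidden parameter.
    return lookup("indent") * " " + "hello"

with binding("indent", 2):
    print(report())      # prints "  hello"
    with binding("indent", 4):
        print(report())  # prints "    hello"
```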

commandersaki•6h ago
Same for "linear programming".
IshKebab•6h ago
Possibly the worst named thing in computing? Not only is "programming" not referring to programming, but "dynamic" is meaningless and just there to try to make it sound fancy.

I prefer "cached recursion".

eru•5h ago
Well, cached recursion is a bit too generic. Almost anything can be done with both caches and recursion.

We had a different characterisation in one of our mathematical optimisation classes: dynamic programming is basically every problem that is isomorphic to finding the longest path in a graph, with suitable 'overloading' of 'maximum' (for picking among multiple alternative paths) and 'addition' (for combining multiple sub-paths).

First, to illustrate what I mean by this overloading: matrix multiplication usually has multiplication (×) and addition (+). However, in the min-plus algebra you overload (+) with minimum and (×) with plus, and then multiplying matrices becomes equivalent to hopping two paths in the adjacency matrix of a graph. (Sorry if this is a bit confusing.) Specifically, taking an adjacency matrix A and calculating A* := 1 + A + AA + AAA + ... is equivalent to taking the transitive closure of edge hopping in your graph, i.e. the shortest paths between all pairs of vertices.

If you overload (+) with maximum and (×) with plus instead, you can find longest paths. For that to make sense, your graph should not have cycles.

Now let's go back to dynamic programming: the shared optimal sub-structure property of dynamic programming is exactly the same as what you have in finding longest paths.

The structure might sound overly restrictive, but you'll find that it works for all (most?) examples of dynamic programming.
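The overloading described above can be sketched in a few lines (a toy three-node DAG, with -inf standing in for "no edge"):

```python
# Max-plus "matrix multiplication" on a DAG's adjacency matrix:
# (+) becomes max, (x) becomes +. Entry [i][j] of A*A is then the
# weight of the heaviest two-hop path from i to j.
NEG_INF = float("-inf")

def maxplus_mul(A, B):
    n = len(A)
    return [[max(A[i][k] + B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Edges: 0 -> 1 (weight 3), 1 -> 2 (weight 4), 0 -> 2 (weight 5).
A = [[NEG_INF, 3, 5],
     [NEG_INF, NEG_INF, 4],
     [NEG_INF, NEG_INF, NEG_INF]]

A2 = maxplus_mul(A, A)
print(A2[0][2])  # 7: the two-hop path 0 -> 1 -> 2 (3 + 4)
```

Swapping max for min recovers shortest paths, which is the min-plus case described above.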

P.S. My recollection was a bit vague, so I asked ChatGPT for some help, and apparently I wasn't completely wrong: https://chatgpt.com/share/687de24c-c674-8009-9984-0fda56d1c1...

P.P.S. Despite everything I said above, I agree that 'cached recursion' is still a better name than dynamic programming.

tgma•5h ago
I actually think overly focusing on the memoized recursion undermines the core conception of dynamic programming. Remember, as noted in the post, the term programming refers to a tabular method of planning, i.e. specifically, the bottom-up calculation to solve a problem. The emphasis here is on a predetermined order on how to tabulate the plan to get to the desired result which will be the solution to the broader problem.

The memoized recursive function is an implementation methodology on a digital computer that specifically side-steps the question of in which order one should do the tabulation. I would argue that recursive memoization is a possible solution technique to dynamic programming problems (defined as the universe of problems that admit a dynamic programming solution,) but strictly speaking, in and of itself, is not dynamic programming.

eru•3h ago
I'd actually go the other way round: figuring out a suitable recurrence relation is usually the first step in any dynamic programming solution. As a next step, sure, you can explicitly construct a table to fill in, but that's not actually a requirement: dynamic programming also works when your parameters aren't things like numbers that you can easily use to enumerate rows in a table.

Memoisation is a more general tactic that hands more of the busy work over to the computer.

317070•5h ago
I think "cached recursion" is too broad. I tend to go with "Minimal Memory Memoization".

1) the recursion solution is often too slow, but uses little memory.

2) the memoization solution can make the algorithms from 1 a lot faster, but blows up memory use.

3) the dynamic programming solution only keeps the previous partial solutions in memory that will be needed for future partial solutions. Therefore it is the "Minimal Memory Memoized" solution. It often requires a smart ordering of partial solutions that allows for the earliest cache eviction.

Your "cached recursion" sounds like number 2 to me, and the crux about dynamic programming is to figure out when to remove an entry from your cache.
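The three stages in the numbered list above can be illustrated with the textbook Fibonacci example (an editor's sketch, not code from the thread):

```python
# 1) Plain recursion: exponential time, no cache.
def fib_rec(n):
    return n if n < 2 else fib_rec(n - 1) + fib_rec(n - 2)

# 2) Memoized recursion: linear time, but keeps ALL n partial results.
def fib_memo(n, cache={}):  # shared cache across calls; fine for a sketch
    if n < 2:
        return n
    if n not in cache:
        cache[n] = fib_memo(n - 1) + fib_memo(n - 2)
    return cache[n]

# 3) "Minimal memory" DP: linear time, keeping only the two partial
#    solutions still needed for future steps (early cache eviction).
def fib_dp(n):
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

print(fib_rec(20), fib_memo(20), fib_dp(20))  # 6765 6765 6765
```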

zelphirkalt•3h ago
Even just keeping the partial solutions in memory that will later be needed can use much more memory than a bottom up solution. Recursion being slow depends on the implementation of your programming language.
317070•1h ago
> Even just keeping the partial solutions in memory that will later be needed can use much more memory than a bottom up solution.

Can you give an example of this? I don't think that is correct. But we might be saying the same thing, because in practice by the time you achieve minimal memory, you will have a bottom-up solution.

Certhas•4h ago
Dynamic is definitely not meaningless. I am familiar with dynamic programming from the optimal control theory point of view, and it's squarely focused on dynamical systems. The central equation of dynamic programming is the Hamilton-Jacobi-Bellman equation, a generalization of the Hamilton-Jacobi equations of classical mechanics. And the terminology programming in this context also exists in linear programming and stochastic programming. It's definitely confusing today, but these terms simply predate the modern use of programming, as well as modern computers. I am honestly a bit puzzled by everyone in this comment section talking about recursion + memoization (as a way to efficiently implement backward induction I presume?). It seems like computer science teaches a particular application and implementation of dynamic programming ideas and people mistake that for the whole thing?
tgma•4h ago
Well, to be fair, you are questioning terminology bastardization in "computer science," a field that is named after "computers" and "science" but has little to do with either. One should temper their expectations :) Informatics, a commonly used term in Europe, would have been a much better name.
eru•3h ago
Well, just pretend that the whole discussion happened in German, where despite using your preferred term of Informatik, they still talk about Dynamische Programmierung.
zelphirkalt•3h ago
Some of these terms sound sooo awkward in German. I vote for calling it memoization.
eru•2h ago
Well, memoisation ain't exactly the same as dynamic programming. You can use memoisation to do dynamic programming, but you can also use it in other contexts (and you can implement dynamic programming in different ways, too).
tuukkah•3h ago
However, information science (and in Finnish, informatiikka) is a synonym of library science. Computer science isn't too bad a name for a science about computing on (abstract) machines.
mollerhoj•50m ago
I prefer the Danish term: Datalogi
mort96•3h ago
I don't understand most of what you wrote. But can you, in simple terms, explain to me why recursion without caching is "static"?
Certhas•1h ago
Why do you think that what I wrote implies they are static?

I didn't say that recursion and caching are the opposite of dynamic, I said they are essentially orthogonal concepts to it.

Generally speaking, dynamical systems are systems that have some sort of dependence on time [1]. In dynamic programming we have an optimization/decision problem in which time plays an essential role. The optimality is measured with respect to the whole time evolution. Bellman showed that you can break this intertemporal optimization down into individual time steps.

So the key observation of dynamic programming is how to turn an "all times at once" problem into a "one time at a time" problem.

The reason that Hamilton's name shows up next to Bellman here is that physicists and mathematicians (Lagrange, Hamilton, Jacobi) figured out that you can do the inverse: You can write the dynamical equations of physics as an intertemporal (all times at once) optimization problem. This has been one of the most influential ideas in theoretical physics _ever_, especially after Noether's theorems leveraged this structure to connect symmetries and conserved quantities. In many fields of theoretical physics you no longer write down differential equations to model a dynamical system, you write down a "Lagrangian", and mean that your system follows those trajectories that minimize the integral of the Lagrangian over all time. So the central ideas of dynamic programming are extremely tightly related to physics and dynamical systems.

[1] https://en.wikipedia.org/wiki/Dynamical_system

Edit: After reading a bit more I think I understand why people focus on this. The algorithmic examples with shortest paths sort of show the point. To me the core point of Dynamic Programming is before you start thinking about recursion or memoization. It's when you establish that you have optimal substructures:

https://en.wikipedia.org/wiki/Optimal_substructure

What is dynamical is that the subproblems depend on each other, in the way that what happens at time t+1 depends on what happens at time t, or, more to the point, the optimal decision at time t depends on the optimal decision at time t+1. If your subproblems are ordered by time, then of course you don't need recursion to solve them; you just iterate over time steps, hence my confusion.

rochak•4h ago
Yup. I don’t mind it though as this field is riddled with names that convey absolutely nothing about the actual thing.
emmelaich•2h ago
It's deliberately obscure. Per the article it's for promotion not description.
tgma•5h ago
Programming terminology is in the same vein as "Linear Programming..."

As per Wikipedia, that origin story is somewhat disputed:

"According to Russell and Norvig, the above story "cannot be strictly true, because his first paper using the term (Bellman, 1952) appeared before Wilson became Secretary of Defense in 1953." Also, Harold J. Kushner stated in a speech that, "On the other hand, when I asked [Bellman] the same question, he replied that he was trying to upstage Dantzig's linear programming by adding dynamic. Perhaps both motivations were true."

random3•4h ago
I can't find the source, but, according to Bellman, he needed a name that wouldn't sound like something the Army would scrap, effectively protecting his research. He had a few options, and "dynamic" sounded "cool" enough, in his opinion, for the Army.

That said, the "programming" part is pretty accurate, but IMO the author misses the point about what "computer programming" means, after all.

mr_toad•1h ago
Linear programming gets confused with programming, but also with linear algebra and linear modelling.
nextaccountic•12m ago
Programming in this sense means just optimization
bob1029•5h ago
Dynamic optimality is a way more interesting concept to me. Data structures like the splay tree can effectively write the "ideal program" for accessing data based on the current use patterns by maintaining the data in a particular way over time. Instructions and data are mostly the same thing when you really think about it.
lblume•1h ago
Dynamic optimality sounds like a very euphemistic description of Levin's Universal Search to me – it is definitely optimal, just in a very, hmm, dynamic, let's say, way.
sureglymop•5h ago
I first learned about Dynamic Programming in the algorithms and data structures class in the first semester at uni. But the lecturer very quickly moved on to only ever referring to it as "DP". This makes me think he may have done that so as not to confuse students.

Even though I knew it wasn't referring to programming I did wonder why it was called that, interesting!

eric-burel•5h ago
The article would probably be more complete with references to similar terms "linear program" or "integer program", unless this is also a slightly different meaning?
pss314•4h ago
Richard Bellman on the Birth of Dynamic Programming (2002) [pdf] https://news.ycombinator.com/item?id=42482289
nottorp•4h ago
> Thus, I thought dynamic programming was a good name. It was something not even a Congressman could object to. So I used it as an umbrella for my activities.

That reminds me of, much later, getting my first basic programming book. In communist Romania. It was titled "Let's learn interactive microelectronics" because "programming" books were considered a waste of resources and did not get approved.

lblume•1h ago
I thought communist nations heavily relied on programs (in the author's sense) for their planned economies?
nottorp•1h ago
No, on "plans". Mostly five year plans. At least in Romanian.

The only other use of "program" I remember from back then is "tv program".

lblume•1h ago
According to [1], the optimization done in the Soviet Union (the largest planned economy) was based on linear programming.

[1]: https://www.rtsg.media/p/soviet-planning-demystified

nottorp•1h ago
But this subthread is not about what they were using but about localized names for computer programming vs other forms of programming.

Also I don't think Soviet examples help because I don't think the last few soviet dictators before the iron curtain's fall were against computers, while Ceausescu definitely was.

sigmoid10•3h ago
Looks like someone discovered this reddit post [1] and wrote a whole blog around the top answer. Since the example is copied verbatim, it might even be an LLM that was hooked up to web search.

[1] https://www.reddit.com/r/learnprogramming/comments/1ac9zbl/d...

globular-toast•3h ago
Huh, I've never realised this. It actually makes more sense if I revert to my native British spelling of programme in this sense.

A program can use dynamic programming to dynamically figure out the programme required to perform a computation.

cousin_it•3h ago
My favorite dynamic programming problem is computing the percent of relatedness between two people, given a set of people and a partial graph of parenthood among them (with the assumption that any relatedness not implied by the graph is zero). It seems very confusing at first, but then you realize that if A is not a descendant of B, rel(A,B)=(rel(A,father(B))+rel(A,mother(B)))/2, which allows you to compute all relatedness values from top to bottom of the graph as fast as possible.
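A sketch of that recurrence in Python (the pedigree and names are made up, and real relatedness computations also need inbreeding corrections, which this ignores):

```python
from functools import lru_cache

# Hypothetical pedigree: child -> (father, mother); missing = unknown,
# and unknown parents contribute zero relatedness.
parents = {
    "child": ("dad", "mum"),
    "half_sib": ("dad", "other_mum"),
}

def ancestors(p):
    # All known ancestors of p, walking the parenthood graph upward.
    seen, stack = set(), [p]
    while stack:
        for par in parents.get(stack.pop(), (None, None)):
            if par is not None and par not in seen:
                seen.add(par)
                stack.append(par)
    return seen

@lru_cache(maxsize=None)
def rel(a, b):
    if a == b:
        return 1.0
    # The recurrence expands a's parents, which is valid when b is not a
    # descendant of a; swap if needed so we always climb "up" the graph.
    if a in ancestors(b):
        a, b = b, a
    dad, mum = parents.get(a, (None, None))
    r = 0.0
    if dad is not None:
        r += rel(dad, b) / 2
    if mum is not None:
        r += rel(mum, b) / 2
    return r

print(rel("child", "half_sib"))  # 0.25: half siblings share one parent
```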
julien6arzola•3h ago
Ok
DonHopkins•2h ago
James Gosling's infamous Emacs screen redisplay algorithm uses dynamic programming.

https://en.wikipedia.org/wiki/Gosling_Emacs

It computed the minimal cost path through a cost matrix of string edit operations (the costs depended, e.g., on the number of characters to draw, the length of the escape codes to insert/delete lines/characters, padding for slow terminals, etc.).

The algorithm used is a dynamic programming one, where

   M[i,j] = MIN (M[i-1,j]+dcost,             # UP,   implies delete
                 M[i,j-1]+icost+redraw cost, # LEFT, implies ins+redraw
                 M[i-1,j-1]+rewrite cost)    # BACK, implies rewrite
Each terminal type would configure the display driver according to its supported escape codes and screen update times (some terminals were REALLY SLOW inserting and deleting lines or even scrolling, and you had to send null padding to wait for it to finish).

It's infamous both for its skull-and-crossbones warning comment [designed by Brian Reid] and the correspondingly poisonous complex code, and also for RMS's battle with UniPress software, incorporating it into Gnu Emacs, getting threatened by UniPress, then rewriting it from scratch.

https://features.slashdot.org/story/13/01/06/163248/richard-...

>Q: Give me your best hack. [...]

>RMS: I can't remember all the hacks that I was proud of, so I can't pick the best. But here's something I remember fondly. The last piece of Gosmacs code that I replaced was the serial terminal scrolling optimizer, a few pages of Gosling's code which was proceeded by a comment with a skull and crossbones, meaning that it was so hard to understand that it was poison. I had to replace it, but worried that the job would be hard. I found a simpler algorithm and got it to work in a few hours, producing code that was shorter, faster, clearer, and more extensible. Then I made it use the terminal commands to insert or delete multiple lines as a single operation, which made screen updating far more efficient.

There's some more discussion about it here that ended up being pretty accurate once all was said and done:

https://www.reddit.com/r/emacs/comments/bek5b2/til_emacs_was...

Emacs's infamous "Ultra-hot screen management package" with its "Skull and Crossbones" warning was definitely black magic!

The algorithm worked great and was well worth it over a 300 / 1200 / 2400 baud connection to a Vax 780 / 750 / etc, and as modems and computers got faster it was still useful, but at today's network bandwidths and cpu speeds it's thunderous overkill.

https://news.ycombinator.com/item?id=38996713

A redisplay algorithm, by James Gosling (ACM SIGPLAN Notices, April 1981):

https://donhopkins.com/home/documents/EmacsRedisplayAlgorith...

>7. Acknowledgements: The people who did the real work behind this paper are Mike Kazar, Charles Leiserson and Craig Everhart; all from CMU.

>Bibliography: 1. Kevin Q. Brown. Dynamic Programming in Computer Science. CMU, February, 1979.

That code was also used in Maryland Windows (a text based overlapping/tiled window system developed at the University of Maryland by Chris Torek, like Emacs - Text Editor + Window System, kind of like "screen" or "mux" or "mgr").

https://donhopkins.com/home/archive/emacs/mw/display.c

https://en.wikipedia.org/wiki/Dynamic_programming

https://wiki.c2.com/?DynamicProgramming

https://en.wikipedia.org/wiki/ManaGeR

https://news.ycombinator.com/item?id=22849522

>Gosling Emacs was especially noteworthy because of the effective redisplay code, which used a dynamic programming technique to solve the classical string-to-string correction problem. The algorithm was quite sophisticated; that section of the source was headed by a skull-and-crossbones in ASCII art, warning any would-be improver that even if they thought they understood how the display code worked, they probably did not.

https://donhopkins.com/home/archive/emacs/mw/display.c

    /*  1   2   3   4   ....            Each Mij represents the minimum cost of
          +---+---+---+---+-----        rearranging the first i lines to map onto
        1 |   |   |   |   |             the first j lines (the j direction
          +---+---+---+---+-----        represents the desired contents of a line,
        2 |   |  \| ^ |   |             i the current contents).  The algorithm
          +---+---\-|-+---+-----        used is a dynamic programming one, where
        3 |   | <-+Mij|   |             M[i,j] = min( M[i-1,j],
          +---+---+---+---+-----                      M[i,j-1]+redraw cost for j,2
        4 |   |   |   |   |                           M[i-1,j-1]+the cost of
          +---+---+---+---+-----                        converting line i to line j);
        . |   |   |   |   |             Line i can be converted to line j by either
        .                               just drawing j, or if they match, by moving
        .                               line i to line j (with insert/delete line)
     */
Trivia: That "Skull and Crossbones" ASCII art is originally from Brian Reid's Scribe program, and is not copyrighted.

https://donhopkins.com/home/archive/emacs/skull-and-crossbon...

                         /-------------\ 
                        /               \ 
                       /                 \ 
                      /                   \ 
                      |   XXXX     XXXX   | 
                      |   XXXX     XXXX   | 
                      |   XXX       XXX   | 
                      \         X         / 
                       --\     XXX     /-- 
                        | |    XXX    | | 
                        | |           | | 
                        | I I I I I I I | 
                        |  I I I I I I  | 
                         \             / 
                          --         -- 
                            \-------/ 
                    XXX                    XXX 
                   XXXXX                  XXXXX 
                   XXXXXXXXX         XXXXXXXXXX 
                          XXXXX   XXXXX 
                             XXXXXXX 
                          XXXXX   XXXXX 
                   XXXXXXXXX         XXXXXXXXXX 
                   XXXXX                  XXXXX 
                    XXX                    XXX 

                          ************** 
                          *  BEWARE!!  * 
                          ************** 

                        All ye who enter here: 
                    Most of the code in this module 
                       is twisted beyond belief! 

                           Tread carefully. 

                    If you think you understand it, 
                              You Don't, 
                            So Look Again.
But if you're not trying to support old terminals (but might still have a slow "thin wire" network connection), there is an orders of magnitude better approach: The Emacs NeWS display driver (for both UniPress and Gnu Emacs) downloaded PostScript code to define an efficient application specific network protocol with instantaneous local input tracking and feedback (unlike how X-Windows uses a fixed protocol, but like how AJAX uses JavaScript).

The source code to Gosling's UniPress Emacs 2.20 just recently surfaced, and the display code is well commented (still including the skull and crossbones and ascii art diagrams):

https://github.com/SimHacker/NeMACS/blob/main/src/DspVScreen...

And the lower level terminal driver layer:

https://github.com/SimHacker/NeMACS/blob/main/src/DspTrm.c

The NeWS terminal drivers I worked on for UniPress and Gnu Emacs were layered on top of that dynamic programming screen update code. But instead of sending escape codes to a dumb terminal, it downloaded PostScript code into the NeWS server to implement drawing, mouse tracking, text selection feedback, pie menus, tabbed windows, etc, and sent binary PostScript tokens back and forth (a practice now called "AJAX" for X = XML or JSON text instead of binary PostScript data and code):

TrmPS.c: https://github.com/SimHacker/NeMACS/blob/main/src/D.term/Trm...

TrmPS.cps: https://github.com/SimHacker/NeMACS/blob/main/src/D.term/Trm...

PostScript stuff for Emacs: https://github.com/SimHacker/NeMACS/tree/main/ps

Emacs 2.20 Demo (NeWS, multiple frames, tabbed windows, pie menus, hypermedia authoring):

https://www.youtube.com/watch?v=hhmU2B79EDU

Here's a brochure from February 1988 about UniPress Emacs 2.20 and "SoftWire" (NeWS without graphics, kind of like Node with PostScript instead of JavaScript).

What is Emacs: https://www.donhopkins.com/home/ties/scans/WhatIsEmacs.pdf

I also worked on the Gnu Emacs 18 NeWS display driver (supporting a single tabbed windows and pie menus in The NeWS Toolkit 2.0):

tnt.ps: https://www.donhopkins.com/home/code/emacs18/src/tnt.ps

tnt.cps: https://www.donhopkins.com/home/code/emacs18/src/tnt_cps.cps

tnt.c: https://www.donhopkins.com/home/code/emacs18/src/tnt.c

tnt-win.el: https://www.donhopkins.com/home/code/emacs18/lisp/term/tnt-w...

More on the NeWS versions of Emacs with links to code here:

https://news.ycombinator.com/item?id=26113192

amelius•2h ago
Makes sense as the most widely known algorithm for finding the minimal edit distance between two strings also relies on DP. It was invented around 1968.

https://en.wikipedia.org/wiki/Wagner%E2%80%93Fischer_algorit...
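For reference, the Wagner-Fischer recurrence is short enough to sketch (a standard textbook formulation, not code from the linked article):

```python
def edit_distance(a, b):
    # d[i][j] = minimum edits turning a[:i] into b[:j] (Wagner-Fischer DP).
    m, n = len(a), len(b)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i          # delete all of a[:i]
    for j in range(n + 1):
        d[0][j] = j          # insert all of b[:j]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # delete a[i-1]
                          d[i][j - 1] + 1,        # insert b[j-1]
                          d[i - 1][j - 1] + cost) # substitute (or keep)
    return d[m][n]

print(edit_distance("kitten", "sitting"))  # 3
```

The screen-update cost matrix in the Emacs code above is the same table, with the unit edit costs replaced by terminal-specific draw/insert/delete costs.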

amelius•2h ago
What would we call it if it were invented today?
GuB-42•1h ago
Probably AI something.

The name "dynamic programming" was chosen to impress bureaucrats, and AI is where it is at today.

"Subtask training" maybe, as the idea is to train the algorithm on smaller tasks to make it more efficient at solving larger tasks. Ok, it is just storing intermediate results, but it sounds more AI-ish like that.

inopinatus•1h ago
Partial/Intermediate Solution Storage
amelius•1h ago
Tabularized memoization?

Bottom-up memoization?

Iterative memoization?

qprofyeh•2h ago
Thanks for sharing. I always viewed DP as finding partials to cache. Something not done a lot in day to day algos. Did not appreciate the name as being related to execution order optimization.
butan-2-ol•2h ago
How
montebicyclelo•2h ago
I always found that badly named things did make learning harder / more jarring; especially if an explanation for the incongruous name wasn't provided.
layer8•2h ago
Similar for “linear optimization”, which doesn’t refer to program optimization.
omnibrain•1h ago
Ok, but what do you call what we are doing:

We have an interpreted language in which you can call "subroutines" by name. But you can build the name in your program "on the fly", for example out of data.

throwawayffffas•38m ago
Dynamic programming is a subfield of mathematical programming. The field of studying optimization problems.

It's called programming because programming means scheduling. And most of the problems the field initially concerned itself with were logistics scheduling problems.

Computer programming is called programming because it involves scheduling instructions.

throwawayffffas•36m ago
For all the people looking for different terminology, the word you are looking for is optimization.

Dynamic optimization makes just as much sense. Same as Mathematical optimization does for the wider field of study.

m3nu•30m ago
In strength training 'programming' means the details of your workout plan. Like how many sets, reps and exercises.
ChrisMarshallNY•21m ago
That's a really interesting origin story. I had no idea, but I also totally believe it. I've seen things, man...