EDIT: I found the quote, from chapter 6:
Psychometricians have closely questioned American scientists of this first modern generation, curious to know what kind of men they were—there were few women among them—and from what backgrounds they emerged. Small liberal arts colleges in the Middle West and on the Pacific coast, one study found, were most productive of scientists then (by contrast, New England in the same period excelled at the manufacture of lawyers).
...
Theoretical physicists averaged the highest verbal IQs among all scientists studied, clustering around 170, almost 20 percent higher than the experimentalists. Theoreticians also averaged the highest spatial IQs, experimentalists ranking second.
[1] https://worldscientific.com/worldscibooks/10.1142/q0436#t=ab...
The field probably does itself a disservice by overemphasising math. That framing can push away people who might actually do really well, especially those strong in reasoning, abstraction, or communication. The linked study is a good reminder to rethink how we present programming, imo.
You can see visual reasoning as a little computational cheat: you can run math problems through the sensory machinery of your brain, which is what brains are really good at (robots struggle to match our levels of dexterity). But the fact remains that you can only visualize in low dimensions, and there are infinitely many dimensions.
Note: You can reduce many problems to 3d, but also many problems in 3d have configuration spaces with much higher dimension, so there's some nuance.
I remember when I first started working on my Master's project on wireless sensor networks, my advisor sat me down and said "I think I know a good project for you. I want you to print out the source code for TinyOS, study it for a week, and come back to me when you think you know enough to make these changes." This was a formative experience for me, and ever since, when joining a new project, I've made sure to take the time to read through the code to understand how things fit together.
Terrible at math, I hate it and feel dyslexic trying to read most mathematical writing. I excelled at it in elementary school, then quickly came to feel frustratingly stupid at it as it became less about algorithms (more on that in a bit...) and all about equations and abstract stuff with unknown applications.
However, programming was natural and easy to pick up. I've repeatedly had to take more time convincing myself I actually understand some supposedly "hard" thing, like pointers or recursion, than it took to learn them in the first place, because they were in fact very easy to understand so I kept second-guessing myself—"I must not get it, because that was easy". I've been the go-to guy for "hard" and "low-level" problems basically everywhere I've worked.
What I've noticed is that when I must read math, the only way I can get any headway is to turn everything into steps, through which some example values may pass and affect one another. I have to turn it all into algorithms. Algorithms, I can get. Attempts to express meaning through equations and proofs, no, I have to painstakingly turn every single boundary between every symbol into a step and "walk through" it to have any hope of understanding it, and once I do, this new understanding only barely illuminates the original representation.
I think programming clicked for me because, as typically encountered and taught, it's very heavy on algorithms and very light on other varieties of mathematical presentation. Plus, there's so much more context available about what variables represent and what routines do than in a jumble of letters and symbols. FFS, if we say Perl is line noise, what's mathematical writing? Straight gibberish from a brain-wrecked Cthulhu cultist? Perl's the clearest thing in the world by comparison!
... where I do run into trouble is languages with "mathy" syntax, where the idiomatic style favors single-letter variables and accomplishing-things-by-asserting-equality. I can't read Haskell to save my life. Put the same supposedly-tricky concepts (monads, type classes) in any more-ordinary language, and it's easy, but when I tried to learn them using Haskell, I couldn't get anywhere at all. Shit, it takes me forever just to understand fizzbuzz-level programs written in Haskell.
It prevented me from having a CS degree, I was unable to complete the math courses, but as far as actual programming and "software engineering" goes (design, etc) it's never hindered me. I can work out the logic and I let the computer do the math.
Edit: I'm downvoted below zero for this comment. I don't know what people are so offended by?
> It prevented me from having a CS degree, I was unable to complete the math courses, but as far as actual programming and "software engineering" goes (design, etc) it's never hindered me. I can work out the logic and I let the computer do the math.
This is what's wild to me: I have a long, successful career in a "STEM" field that's allegedly math-heavy, while being practically incapable of working with math. Like, it's never even been slightly a problem. I can't relate at all to characterizations of programming as heavy on math. It's never been my experience of it, and at this rate, probably never will be. If it were, I'd for-sure be in a different job.
I think one "verbal" skill that has served me well is fast reading. When you have to read a 300-page novel once a week, you learn how to skim for key elements, which is immensely useful for getting up to speed in a new codebase/language or locating a bug.
> though 18 years in food service means I'm really quick at estimating percentages to within delta
This is an interesting comment! No trolling: were you a bread or pastry baker? I am curious to hear more about this experience.
This is me and also many of the CS students in my cohort, and AFAIK something that universities actively selected for, and that students also self-selected around, in an era of RTFM/MUDs/IRC before LLMs or YouTube. The best programmers I've worked with are still always very linguistically brained: polyglots even when they didn't have to be, or with a long track record of engaging with difficult literature. If nothing else, just very witty in that certain way that's meta-cognitive, meta-linguistic.
This is still true, I think, but it's much harder to see out in the wild given how popular the degree/career trajectory has become. Plus, as long as we're optimizing for leet-coding, even though built-from-scratch algorithms are a very rare need compared to skill in good design/exposition, the math-brain is naturally favored.
Some schools like MIT might have required more, but on average what I wrote was about it. Has it increased since then? Based on the new hires I've seen the last decade I'd have guessed the math requirements were mostly the same.
Anyway, both computation and math are grouped under "apriori" knowledge. Any semantic distinction is ultimately silly. But we could just as easily be teaching programming as a craft in the context of the real world—I think this is closer to how it's done outside the US. I am not at all convinced the American style is what people ought to be paying for.
Yeah, I never thought this made sense, but so many people did; and I always hear people on Slashdot talking about how programming IS math. None of that has been my personal experience, and I'm coming up on 21 years as a software engineer. Discrete was the ONLY math course that I really enjoyed and did well in the first time around. For me, this always made sense.
I can count the times I've ever applied math past approximately high school algebra 1, on one hand. Period, in private life, in hobbies, at work. I'm not sure I've ever used any "college level" math, for anything at all.
I, and other programmers I've known, have gotten excited on the very few occasions anything even slightly mathematically tricky came up, precisely because it almost never happens.
You chose CS but really wanted Software Engineering. Discrete Mathematics was all you needed for that.
It helped that Python was meant to resemble natural language. I had learned C++ and Perl before but they never stuck, because I never made the connection to language. Ironically, since Perl was designed by a linguist!
Kurt Vonnegut: See, I came up through a chemistry department.
Charlie Rose: Yeah, right.
Kurt Vonnegut: And so I wrote and there was nobody there to tell me whether it was any good or not. I was just making my soul grow, writing stories.
There's some stuff about his opinion on training for writing that could be relevant:
https://charlierose.com/videos/25437
I don't think it's fair to attribute anything to anything. Stuff comes from all over the place. In other words, attributing programming prowess to math was a mistake, and we are making the same mistake again attributing it to language.
---
Just one more:
Kurt Vonnegut: --consider himself in competition with a world's champion. And this is one reason good writers are unlikely to come from an English Department. It's because the English Department teaches you good taste too early.
I think his main point is when we put something on a pedestal, we actually limit people, whether that be math or language.
My experience is that they spit out reasonable-looking solutions that then don't even parse/compile.
They are OK for creating small snippets of code and for completion.
Anything past that, they suck.
It's actually hilarious that AI "solved" bullshitting and artistic fields much better and faster than reasoning fields like math or programming.
It's the supreme irony. Even 5 years ago the conventional wisdom was that artistic fields were completely safe from the AI apocalypse.
Just as an LLM may be good at spitting out code that looks plausible but fails to work, diffusion models are good at spitting out art that looks shiny but is lacking in any real creativity or artistic expression.
My experience with that is that artistic milieus now sometimes even explicitly admit that the difference is who created the art.
"Human that suffered and created something" => high quality art
"The exact same thing but by a machine" => soulless claptrap
It's not about the end result.
A lot could be written about this but it's completely socially unacceptable.
Whether an analogous thing will happen with beautiful mathematical proofs or physical theories remains to be seen. I for one am curious, but as far as art is concerned, in my view it's done.
This has nothing to do with whether a human or AI created the art, and I don't think it's controversial to say that AI-generated art is derivative; the models are literally trained to mimic existing artwork.
Your "creativity" is just "high temperature" novel art done by the right person/entity.
This was something already obvious to anyone paying attention. Innovation from the "wrong people" was just "sophomoric", derivative or another euphemism, but the same thing from the right person would be a work of genius.
It's like people enjoy extrapolating their surprise when it comes to LLMs, and I don't think it's very helpful.
Oddly, I also use spatial intuition when thinking about stuff like stacks and the shape of data structures.
You sure about that? How about inductive proofs?
I would just say that language is more familiar to most. Mathematics are also languages, but more formal and foreign to most.
Edit: added a definition of apriori knowledge.
Edit2: to put this another way, nobody is arguing that recursion doesn't exist. Or that it is empirically-derived. No, it's a useful construct to show certain relations.
Edit3: added a sentence
Edit4: The extent to which our own grammars are inherently recursive vs this being culture or technology is irrelevant to identifying the concept of recursion as an apriori, linguistic concept.
Edit5: I suppose you might also be referring to the idea that we naturally process recursion. I mean, we clearly, evidently do; whether or not that's inherent to being human is a separate question entirely. Hell, in the free software world there's a whole recursive-acronym meme that taps into some part of our brain and tickles it.
With that argument everything is fundamentally linguistic since everything is communicated using a language.
Can you come up with a more reasonable argument?
If recursion were just writing the function out 10 times, as you did in language, then people wouldn't struggle with it.
So in this case, "recursive function" would be "clause" or something like that; I'm no linguist. But clauses can embed clauses which can embed further clauses, etc.
I think your usage of recursive functions is just high-level logic—you're describing an inductive proof. We also frame a lot of our social games as recursive processes. But these are conscious processes that we can evaluate consciously; the recursion in spoken language is largely unconscious and very shallow.
But people are constructing sentences, not grammars. When you construct a grammar you can add a recursive part to it, that is true, just like in a programming language, but constructing grammars is not what people mean by language skills.
A sentence can't be recursive, since a language in itself has no concept of applying a concept; for that you need an interpretation of the language's references. For example, you can have a recursive function written in a programming language whose grammar is not recursive; the concepts are different things.
1. Our spoken and especially written grammar is recursive. We do handle this unconsciously. This is not related to our ability to reason about recursion at a high level, and recursive grammars are not necessary to do so. This is not a skill in the normal sense and we have only (very) limited ability to improve our capacity to interpret deeply nested grammars. However, this is still a useful illustration of what recursion IS, which is why I brought it up.
2. Language also introduces the ability to semantically reason about recursiveness. This is still a linguistic thing—you need a symbol and relations among symbols in order for recursion to be meaningful—but this is a skill and is likely very related to linguistic skill. This is the part that really helps you to program: ultimately, you're just reasoning about symbols and looking for incoherency.
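To illustrate point 1, here's a minimal sketch (the toy grammar and all names are invented for illustration) of how a single recursive rule licenses nested clauses, and why comprehension stays shallow:

    # A toy grammar rule: a noun phrase may contain a relative clause
    # that itself contains a noun phrase -- that's the recursion.
    def noun_phrase(depth: int) -> str:
        if depth == 0:
            return "the rat"
        return f"the rat {noun_phrase(depth - 1)} chased"

    print(noun_phrase(0) + " squeaked")  # fine
    print(noun_phrase(1) + " squeaked")  # "the rat the rat chased squeaked": parseable
    print(noun_phrase(2) + " squeaked")  # grammatical, but hopeless to read

Every sentence it prints is licensed by the same one-line recursive rule, yet comprehension collapses after a level or two of embedding, which is the "very shallow" point.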
What does this mean exactly?
> What does this mean exactly?
- Defining recursion is linguistic
- Defining a function recursively is mathematical
Besides, a lot of what people mean when they say they're bad at math is that they're bad at arithmetic, which is honestly understandable.
But that isn't what we mean with recursive function. We don't call this recursive:
x = x + 1
It's just incrementing x.
That's not a recursive function as it's written, but you could certainly consider it a form of symbolic recursion; it's just not a very useful characterization in an iterative/imperative context. You could frame incrementing as recursive, though: this is just the Peano axioms / Church encoding.
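For what it's worth, here's a minimal sketch of that framing (a Church encoding in Python; all the names are mine, for illustration). Under this encoding a number is "apply f n times", and the successor really is defined by recursion on structure:

    # Church numerals: the number n is encoded as "apply f n times".
    zero = lambda f: lambda x: x
    succ = lambda n: lambda f: lambda x: f(n(f)(x))  # "+1", recursively

    def to_int(n):
        # Interpret a numeral by counting function applications.
        return n(lambda k: k + 1)(0)

    three = succ(succ(succ(zero)))
    print(to_int(three))        # 3
    print(to_int(succ(three)))  # 4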
I've always wondered why FP isn't more popular. I concluded it's because most folks don't like thinking in terms of abstract math.
I think the problem-solving part of coding requires math skills, while the organization part requires writing skills. The organization part affects the problem-solving part, because if you write messy code (that you can’t reread once you forget or extend without rewriting) you’ll quickly get overwhelmed.
Writing large math proofs also requires organization skills, since you’ll refer to earlier sections of your proof and may have to modify it when you encounter issues. But to me, math seems to have more “big steps”: sudden insights that can’t be derived from writing (“how did you discover this?”), and concepts that are intrinsically complicated so one can’t really explain them no matter how well they can write. Whereas programming has more “small steps”: even someone who’s not smart (but has grit) can write an impressive program, if they write one component at a time and there aren’t too many components that rely on each other.
Just like in math.
BTW relational DBs are math.
It's funny, reading the post you're replying to, I basically read it as
> I don't need math, I need <math, but by another name>
My teenage daughter used to complain about math, and I spent some time trying to explain to her that we use math every day... EVERY day. Now, when I see her do something that was math (even if it's not obvious it was math), I say "Math... every day". I say that a lot.
Also, yes, my daughter finds me annoying. But also funny; but likely not for the math thing.
> I don't need math, I need <math, but by another name>
This seems to be how it always goes. I think we've confused a lot of people by conflating math with arithmetic.
So how much of programming is maths? Before we answer that, let's answer: how much of maths is actually maths? First we define maths, and then we define programming based on whatever that is; until we have that first concrete definition, this discussion cannot occur.
I will add that "it is taught by the maths department in college" is a flimsy argument, and frankly one the Physics department in particular would mock.
I find this distinction useful in the abstract, that one can engage different parts of the brain for different components of development. This probably explains why a well-written DSL can be so powerful in the right context.
Firstly, computer science is math.
Secondly, I remember covering graphs in a discrete math course back when I was in college.
> What if you do it in SQL?
SQL is more-or-less a poor implementation of relational algebra. Ie, math.
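Concretely, a query maps straight onto relational-algebra operators (the `users` table here is hypothetical; σ is selection, π is projection):

    SELECT name FROM users WHERE age > 30
      ≡  π_name(σ_age>30(users))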
> Most coding doesn't need much of any math past boolean logic and very basic set operations
Coding IS math.
Not "coding uses math", I mean it is math.
Mathematicians do not deal in objects, but in relations among objects; they are free to replace some object by others so long as the relations remain unchanged. Content to them is irrelevant; they are interested in form only.
- Poincare[0]
I don't know how you code, but I'm not aware of code that can't reasonably be explained as forming relationships between objects. The fact that we can trace a program seems to necessitate this.
[0] https://philosophy.stackexchange.com/questions/22440/what-di...
But I'll refer you to a longer conversation if it helps https://news.ycombinator.com/item?id=43872687
Coding used to be very close to pure math (many early computer science classes were taught in the Math Department in universities) but it has been so far abstracted from that to the point that it is its own thing and is as close to math as any other subject is.
> By that same logic you could also say that language is math
Not quite, but the inverse is true. The language-to-math direction doesn't work because of a lack of formalism. I can state incomprehensible sentences or words (there's an advantage to that in some cases!), but when you do that with code you get errors, and even if you do it with math, it's just that there's no compiler or interpreter to yell at you.
Since you can express paradoxes with math, perhaps it's not that different.
The contradiction is used in proof formulation, specifically to invalidate some claim. I don't think this is what you're implying.
The latter is what it contextually sounds like you're stating; things like the Banach-Tarski paradox. There's no self-contradiction in that, but it is an unexpected result and points to the need to refine things like ZFC set theory.
I'd also stress that there are true statements which cannot be proven within an axiomatic system. The Halting Problem is a computational analogue of what Gödel proved. But that's not contradictory, even if unexpected or frustrating.
> coding and math being equivalent
Please see lambda calculus. I mean equivalent in the way mathematicians do: that we can uniquely map everything from one set to the other.
> I mean equivalent in the way mathematicians do
That sounds like you're backing off from your original claim, probably because it is impossible to defend.
That you can use mathematics to describe code doesn't seem very different from using math to describe gravity, or the projected winner in an election, or how sound waves propagate.
Isn't the primary purpose of math to describe the world around us?
Then it shouldn't be surprising that it can also be used to describe programming.
In the real world, however, software engineering has nothing to do with mathematical abstractions 99% of the time.
Though interpreting a CRUD app as a theorem (or collection of theorems) doesn’t result in an interesting theorem, and interpreting a typical theorem as a program… well, sometimes the result would be a useful program, but often it wouldn’t be.
It's not the type of thing that gets mathematicians excited, but from an engineering perspective, such theorems are great. You can often blindly code your way through things by just following the type signatures and having a vague sense of what you want to accomplish.
It's actually the halting problem that I find is not relevant to practical programming; in practice, CRUD apps are basically a trivial loop around a dispatcher into a bunch of simple functions operating on bounded data. The hard parts have been neatly tidied away into databases and operating systems (which for practical purposes, you can usually import as "axioms").
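A minimal sketch of that shape (everything here is invented for illustration): the whole app reduces to a loop that dispatches operations to small handlers over bounded data.

    # A CRUD app boiled down to its skeleton: a loop around a dispatcher.
    store: dict[int, str] = {}

    def create(key, value): store[key] = value
    def read(key, _=None):  return store.get(key)
    def update(key, value): store[key] = value
    def delete(key, _=None): store.pop(key, None)

    handlers = {"create": create, "read": read,
                "update": update, "delete": delete}

    def dispatch(op, key, value=None):
        return handlers[op](key, value)

    dispatch("create", 1, "hello")
    print(dispatch("read", 1))  # hello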
> It's not the type of thing that gets mathematicians excited
Says who? I've certainly seen mathematicians get excited about these kinds of things. Frequently they study Programming Languages and will talk your ear off about Category Theory.
> You can often blindly code your way through things by just following the type signatures and having a vague sense of what you want to accomplish
Sounds like math to me. A simple and imprecise math, but still math, via Poincare's description.
> in practice, CRUD apps are basically a trivial loop around a dispatcher into a bunch of simple functions operating on bounded data
In common settings. But those settings also change. You may see those uncommon settings as not practical or useful, but I'd say that studying them is necessary for them to become practical and useful (presumably with additional benefits the current paradigm doesn't have).
> Isn't the primary purpose of math to describe the world around us?
No, that's Physics[0]. I joke that "Physics is the subset of mathematics that reflects the observable world." This is also a jab at String Theorists[1].
Physicists use math, but that doesn't mean physics is math. It's not the only language at their disposal, nor do they use all of math.
> software engineering has nothing to do with mathematical abstractions 99% of the time
I'd argue that 100% of the time it has to do with mathematical abstractions. Please read the Poincare quote again. Take a moment to digest his meaning. Determine what an "object" means, what he means by "[content] is irrelevant", and why only form matters. I'll give you a lead: a class object isn't the only type of object in programming, nor is a type object. :)
[0] Technically a specific (class of) physics, but the physics that any reasonable reader knows I'm referencing. But hey, I'll be a tad pedantic.
[1] String Theory is untestable, therefore doesn't really reflect the observable world. Even if all observable consequences could be explained through this theory it would still be indistinguishable from any other alternative theory which could do so. But we're getting too meta and this joke is rarely enjoyed outside mathematician and physicist communities.
Going on a total tangent, if you'll forgive me, and I ask purely as a curious outsider: do you think math could have ever come into being if it weren't to fill the human need of describing and categorizing the world?
What would have been the very beginning of math, the first human thought, or word or action, that could be called "math"? Are you able to picture this?
> do you think math could have ever come into being if it weren't to fill the human need of describing and categorizing the world?
I'm a bit confused. What exactly is the counterfactual[0] here? If it is hyper-specific to categorizing and describing, then I think yes, those creatures could still invent math.
But my confusion is because I'm having a difficult time thinking of cases where such things aren't also necessary consequences of just being a living being in general. I cannot think of a single creature that does not have some world model, even if that model is very poor. My cat understands physics and math, even though her understandings are quite naive (also, Wittgenstein[1] is quite wrong: I can understand my cat, even if not completely, and even though she has a much harder time understanding me). More naive than, say, the Greeks, but they were also significantly more naive than your average math undergrad, and I wouldn't say the Greeks "didn't do math".
It necessitates a threshold value and I'm not sure that this is useful framing. At least until we have a mutual understanding of what threshold we're concerned with. Frankly, we often place these contrived thresholds/barriers in continuous processes. They can be helpful but they also lead to a lot of confusion.
> What would have been the very beginning of math
This too is hard to describe. Mull over the Poincare quote a bit; there are many thresholds we could pick from.
I could say it was when some of the Greeks got tired of arguing with people who were just pulling shit out of their asses, but that'd ignore the many times other civilizations independently did the same.
I could say when the first conscious creature arose (I don't know when this was). It needed to understand itself (an object) and its relationship to others. Other creatures, other things, other... objects.
I could also say the first living creature. As I said above, even a bad world model has some understanding that there are objects and relationships between them.
I could also say it always was. But then we get into a "tree falls in a forest and no one is around to hear it" type of thing (also with the prior one). Acoustic vibrations is a fine definition, but so is "what one hears".
I'd more put the line closer to "Greeks" (and probably conscious). The reason for this is formalization, and I think this is a sufficient point where there's near universal agreement. In quotes because I'll accept any point in time that can qualify with the intended distinction, which is really hard to pin-point. I'm certainly not a historian nor remotely qualified to point to a reasonable time lol. But this also seems to be a point in history often referenced as being near "the birth" and frankly I'm more interested in other questions/topics than really getting to the bottom of this one. It also seems unprovable, and I'm okay with that. I'm not so certain it matters when that happened.
To clarify, I do not think life itself necessitates this type of formalization, though. I'm unsure what conditions are necessary for it to happen (as an ML researcher I am concerned with this question), but it does seem to be a natural consequence of a sufficient level of intelligence.
I'll put it this way, if we meet an alien creature I would be astonished if they did not have math. I have no reason to believe that their math would look remotely similar to ours, and I do think there would be difficulties in communicating, but if we both understand Poincare's meaning then it'll surely make that process easier.
Sorry, I know that was long and probably confusing. I just don't have a great answer. Certainly I don't know the answer either. So all I can give are some of my thoughts.
[0] https://www.inference.vc/causal-inference-3-counterfactuals/
Math is about abstractions and relations. See the Poincare quote again.
Plus, the Programming Languages people would like to have a word with you. Two actually: Category Theory. But really, if you get them started they won't shut up. That's either a great time or a terrible time, but I think for most it is the latter.
Programming is an expression of logic, which is absolutely mathematics.
But then we also have to think about naming variables and classes, structuring our code so that it is more readable by other developers, and so on. That's less about formal reasoning and more about communication.
There is an engineering aspect to programming (prototyping, architecture, optimization, etc). It's a combination of mathematics and engineering. Software Engineering.
Structuring sentences and naming variables so that it is easier for other people to understand is less about formal mathematical reasoning, and more about communication.
You could name a variable x, y, or Banana, but it doesn't change the logic.
I mean, the reason we get mad at this is that it is someone destroying "society" in some sense, even if that society is your team or just the set of programmers. It would be a pretty dick move were I to use a word that significantly diverged from its conventional meaning and expect you to mull it over. Similarly if I dropped an unknown, out-of-context math equation. It would be meaningless.
And I'm on your side, really! I strongly advocate for documenting. And let's be real, the conclusion of your argument more strongly argues for documentation than good variable names. Because variable names are much more constrained and much more easily misinterpreted considering how any word has multiple definitions. Surrounding code is often insufficient to derive necessary contextualization.
Mathematicians do have to deal with difficulties in naming things.
> They just use i, x, etc. all over the place
I do agree with your point btw, but I did want to note that there are good conventions around symbols. The brevity is heavily influenced by the medium. Variable names sucked when you had punch cards, and it's still burdensome to write long names on paper, a chalkboard, a whiteboard, or any system that doesn't have autocomplete.
In general, lower-case letters are used as constants, excluding x, y, z, t, i, j, k (sometimes u, v, w). It isn't a hard rule, but there's a strong preference to begin at the start of the alphabet for constants. Capital letters are usually reserved for things like variable sets (e.g. random variables). Greek letters need context, serving as constants or variables. Blackboard-bold and calligraphic typefaces mark sets (e.g. the real numbers, the integers). And much more.
I think a lot of the difficulty in it is that these "rules" or patterns are generally learned through usage and often not explicitly stated. But learning them can really help read unfamiliar topics and is why "notation abuse" leads to confusion. But after all, math is all about abstraction so technically any symbol will do, but no doubt some are (significantly) better than others for communicating.
There are two hard things in Computer Science:
- Cache Invalidation
- Naming Things
- Off-by-One Errors
and also philosophy.
> Not "coding uses math", I mean it is math.
Arguably writing a novel is math, if you use the right definition of math. But sometimes it's more helpful to use informal definitions that capture what people mean than ones that are technically accurate.
No, not always. Quite a lot of high-level code doesn't require any math at all. It doesn't take math to perform CRUD operations, which account for a lot of programming work. Sure, the underlying machine code is all based on math, but the higher level programming doesn't need to involve a single math equation for it to be useful. Let's see where the goalposts move now...
But the CRUD logic is so basic and boring, so obvious, that it doesn't require any thought.
Which code? The machine code that underlies everything? Or the lines of simple high-level CRUD that don't even need a single number or logic statement to function? Not all programming has to be mathematical or even logical at a high enough level, and I say this as someone who's been coding assembly language for 40 years.
Those lines ARE mathematical logical statements.
Each line defines logical operations that are executed by the computer.
Same for high level or low level programming. It's all logic.
> It doesn't take math to perform CRUD operation
Yes it does. Just because the objects you are working with aren't numbers doesn't mean it isn't math. In fact, that's my entire point. It is why I quoted Poincare in the first place: he didn't say "numbers", he said "objects".
In a typical implementation these are database operations. That involves relational algebra operations, state transitions, boolean logic.
The READ part can be a very complex SQL query (composed of many algebraic operations) but even the simplest query (SELECT * from t where ID = 1) is filtering a set based on a predicate. That is mathematics.
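A minimal sketch of that claim (the table and rows are invented for illustration): the same query is just predicate-based filtering over a set of tuples.

    # SELECT * FROM t WHERE ID = 1, as filtering a set by a predicate.
    t = [{"id": 1, "name": "ada"}, {"id": 2, "name": "grace"}]
    result = [row for row in t if row["id"] == 1]  # the selection σ_id=1(t)
    print(result)  # [{'id': 1, 'name': 'ada'}]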
No one is moving goalposts. Set theory and logic are at the foundations of mathematics.
In math we are highly concerned with structures and abstraction. We have things called operators, and they aren't just addition and multiplication; we also use those words to describe different operations. We have things like groups, rings, fields, and algebras. Yes, plural.
The purpose of these things is to create logical frameworks. It matters not what the operators are. Nor does it matter what objects we operate on. Poincaré is explicitly saying this.
The part you're not understanding is the abstraction. This is what math is about. It is also why the Programming Language people are deeper in the math side and love Category Theory (I have a few friends whose PL dissertations are more math-heavy than some math dissertations I've seen). It's no surprise. What's a function? How do you abstract functions? How do you define structure? These are shared critical questions. PL people are more concerned with types, but I wouldn't say that makes them any less of a mathematician than a set theorist.
We can perfectly describe the CRUD operations with set theory. Do most programmers need to concern themselves with this? Absolutely not. But is it something the people designing those operations and systems are thinking about? Yes.
I'd encourage you to learn some set theory, abstract algebra, and maybe a bit of cat theory. It'll make these things pretty clear. But I'd caution about having strong opinions on topics you aren't intimately familiar with. Especially when arguing with those that are. Frankly, CRUD is a terrible example. I'm confident a quick google search (or asking a GPT) would quickly point you to relational algebra. It's discussed in most database classes. It certainly was in the one I was an assistant for.
This is vague and doesn't mean anything. People can't even agree what 'objects' are and people did a lot of programming before the term 'object' was invented.
Programming is fundamentally about instructions and data: a yin and yang of two things that are completely different, even if people can occasionally mix and conflate them.
I'm surprised I hit a nerve with so many people. I'm quoting someone who's considered one of the greatest mathematicians. Obviously I don't know the backgrounds of people but it seems like programmers have strong opinions on what math is that disagrees with what mathematicians say they do.
... In my experience, learning to write one component at a time (and try the code, and make sure it works before proceeding) is itself a skill that many struggle to develop. Similarly for avoiding unnecessary dependencies between components. Oh, and also being able to analyze the problem and identify separable components.
One of the most frustrating things about teaching programming, for me, is the constant insistence from other teachers that you have to maintain an "absolutely everyone can learn to program" attitude at all times. Many people who start to learn programming have misguided or confused reasons for doing so and - to say the least - could make much more effective use of their time developing other skills. (It's not a question of elitism; I'd surely flounder at some tasks that others find natural.)
I very much think many people could learn the more advanced Excel Formulas, Power Automate and even simple Bash/PowerShell scripting to make their work more effective. I've met quite a few folks who had been intimidated out of trying who could do it.
On the other hand, how many people on this site could bootstrap a linux kernel on either very new or very old hardware? I know there are some, but they are certainly not the majority. I certainly won't be the first person to get linux and doom to run on a quantum computer.
But that is similar to other professions. Everyone with a largely functioning body can learn to turn a few planks and some metal parts into a functional shed door with some basic tools or to put up a decent brick wall that won't topple over in a month.
That doesn't mean everyone is qualified to pour concrete for a dam or a bridge foundation, or to re-do some historical work in original style.
It's shocking how little physical and spatial ability some people have, so that is definitely not true. Sometimes it might be self-discounting or a lack of confidence, but it remains true regardless of the cause.
> That doesn't mean everyone is qualified to pour concrete for a dam or a bridge foundation, or to re-do some historical work in original style.
Exactly!
I think statements like that are more concerned with philosophy than reality. Any discussion surrounding topics like this typically ends up being a discussion around definitions.
I believe the vast majority of human beings are capable of learning how to program in the most extreme elementary sense of the word. As in, outside of severe disabilities or complete and utter inaccessibility to circumstances in which one could learn program, then I think the remaining population of people could learn to program to some degree. Obviously, not everyone will learn to program due to a near infinite number of reasons.
I would argue it's like music. Anyone can make 'music.' Just make a sound -- any sound. The difference between noise and music is subjective. I would not argue that everyone could be the next Lovelace, Turing, Ritchie, Thompson, Torvalds, etc.
Now, for my jaded opinion, I think a lot of the "everyone can learn to program" talk does not come from a place of desire to share the gift of knowledge and joy of programming. I think it's more of a subtle way to encourage people to go into programming so that they may be hired by mega corps. In order to keep the Capitalist machine running. It's like the National Hockey League's slogan, "Hockey is for everyone." That is just a fancy way of saying, "everyone's money can be spent on the NHL."
I might be capable of learning advanced accounting, but that sounds like torture to me and I'll be damned if I'll ever take that on! I'm sure programming feels like that to a wide variety of people, and I don't see any need for us to try to pretend otherwise - outside of a bizarre ideological desire for equivalent outcomes from disparate groups.
I'm sure everyone is capable of learning some basic level of programming, just as they are able to learn a basic (high school) level of any subject. However, not everyone is going to have the aptitude to take that to an advanced professional level, however hard they try. We're not all cut out to be artists, or writers, or doctors, or scientists, or developers, etc.
Personally I've always considered a solid grasp of algebra to be the minimum bar for being able to program, at least for anything that isn't utterly trivial. Being able to take a word problem and turn it into a system of equations and solve it is a pretty close analog to being able to take some sort of problem or business requirement and turn it into code.
And the sad truth is that a huge percentage of the population struggle with just arithmetic, let alone algebra.
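As a made-up example of that analog (sketched with sympy): "Alice is twice Bob's age, and together their ages sum to 36" becomes a small system of equations, much like turning a requirement into code.

    # "Alice is twice Bob's age; their ages sum to 36."
    from sympy import Eq, solve, symbols

    a, b = symbols("a b")
    equations = [Eq(a, 2 * b), Eq(a + b, 36)]
    print(solve(equations, [a, b]))  # {a: 24, b: 12}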
After collecting data for a few semesters, he concluded his students could be clearly divided into three categories: those who just "got" programming, those who understood it after working hard, and a small group that just didn't grasp it regardless of effort.
Computation is following an algorithm. e.g. long division or computing a derivative.
Intuition, AKA brilliance, is finding a non-obvious solution. Think "solving an NP problem without brute force"*. e.g. solving an integral (in a form that hasn't already been memorized) or discovering an interesting proof.
Organization is recording information in a way that a) is easy for you to recall later on (and get insights from) and b) is digestible by others**. e.g. explaining how to compute a derivative, solve an integral, or anything else.
Math, programming, and writing each require all skills. The kind of math taught in school (e.g. long division) and your boring jobs are primarily computation. I believe advanced math (e.g. calculus) is primarily intuition; it requires some organization because big theories are broken into smaller steps, but seems to mostly involve smart people "banging their head against the wall" to solve problems that are still quite unclear***. Programming is primarily organization. It requires some intuition (I think this is why some people seemingly can't learn to code), but in contrast to math, most programs can be broken into many relatively-simple features. IMO implementing all the features and interactions between them without creating a buggy, verbose, and unmaintainable codebase is programming's real challenge. Writing is also primarily organization, but finding interesting ideas requires intuition, and worldbuilding requires computation (even in fiction, there must be some coherence or people won't like your work).
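To make the "computation is following an algorithm" point above concrete: computing a derivative really is pure rule-following once the rules are written down. A minimal sketch (the expression encoding is invented for illustration):

    # Differentiate expressions built from ("x",), ("const", c),
    # ("add", f, g) and ("mul", f, g) by mechanically applying rules.
    def d(e):
        if e[0] == "x":     return ("const", 1)                # dx/dx = 1
        if e[0] == "const": return ("const", 0)                # dc/dx = 0
        if e[0] == "add":   return ("add", d(e[1]), d(e[2]))   # linearity
        if e[0] == "mul":   # product rule
            return ("add", ("mul", d(e[1]), e[2]),
                           ("mul", e[1], d(e[2])))

    # d/dx of x*x + 3: prints an unsimplified tree meaning 1*x + x*1 + 0, i.e. 2x.
    print(d(("add", ("mul", ("x",), ("x",)), ("const", 3))))

No insight required, just rules; that's the "computation" skill in isolation.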
> Some people are really good at mucking around garbage code (they have no choice, they get paid to), but what part of programming did they get good at? Obviously, some part of it, but nothing to write home about.
I agree that work you find boring should be avoided, and I also try to avoid working with it. But some people really seem to like working on esoteric code, and I think there are some skills (beyond computation) developed from it, that even apply when working with good code. Building a mental model of a spaghetti codebase involves organization, and if the codebase uses "genius hacks", intuition. Moreover, the same techniques to discern that two code segments in completely different locations are tightly coupled, may also discern that two seemingly-separate ideas have some connection, leading to an "intuitive" discovery. There's an MIT lecture somewhere that describes how a smart student found interesting work in a factory, and I think ended up optimizing the factory; the lesson was that you can gain some amount of knowledge and growth from pretty much any experience, and sometimes there's a lot of opportunity where you'd least expect it.
* Or maybe it is just brute force but people with this skill ("geniuses") do it very fast.
** These are kind of two separate skills but they're similar. Moreover, b) is more important because it's necessary for problems too large for one person to solve, and it implies a).
*** And whatever method solves these problems doesn't seem to be simplification, because many theories and proofs were initially written down very obtuse, then simplified later.
Usually, even deciding what the problem is is in part an art; it requires an act of narrativization, shaping and forming concepts of origin, movement, and destination.
A good problem solver has a very wide range of abstract ideas and concepts and concrete tools they can use to model and explain problem, solution, & destination. Sometimes raw computational intellect can arrive at stunningly good proposals, can see brilliant paths through. But more often, my gut tells me it's about having a breadth of exposure, to different techniques and tools, and being someone who can both see a vast number of ways to tackle a situation, and being able to see tradeoffs in approaches, being able to weight long and short term impacts.
> Its very rare imo that computational problems emerge fully formed & ready to be tackled like proofs.
In my generation, the perfect example is Python's Timsort. It is a modest improvement upon prior sorting algorithms, but it has come to dominate. And, frankly, in terms of computer-science history, it was discovered very late. The paper was written in 1993, but the first major, high-impact open source implementation was not written until 2003. Ref: https://en.wikipedia.org/wiki/Timsort
It has been reimplemented in a wide variety of languages today. I look forward to the next iteration: WolfgangSort or FatimaSort or XiaomiSort or whatever.
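Timsort's core trick, roughly, is to exploit runs that are already sorted in the data. A minimal sketch of just that idea (a plain natural merge sort; real Timsort adds minimum run lengths, galloping, and careful merge scheduling):

    # Natural merge sort: find already-ascending runs, then merge them.
    from heapq import merge

    def natural_merge_sort(xs):
        if not xs:
            return []
        runs, run = [], [xs[0]]
        for a, b in zip(xs, xs[1:]):
            if b >= a:
                run.append(b)        # extend the current ascending run
            else:
                runs.append(run)     # run ended; start a new one
                run = [b]
        runs.append(run)
        while len(runs) > 1:         # merge runs pairwise until one remains
            merged = [list(merge(p, q)) for p, q in zip(runs[::2], runs[1::2])]
            if len(runs) % 2:
                merged.append(runs[-1])
            runs = merged
        return runs[0]

    print(natural_merge_sort([3, 4, 5, 1, 2, 8, 9, 7]))  # [1, 2, 3, 4, 5, 7, 8, 9]

On partially sorted input it does far fewer comparisons than a sort that ignores existing order, which is exactly the case Timsort was designed for.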
"Good code" is very subjective. Even readability and modularity can be taken too far.
> concepts that are intrinsically complicated,
I'm not a mathematician, but I figure mathematicians aim for clean, composable abstractions the same way programmers do. Something complicated, not just complex in its interactions with other things, seems more useful as a bespoke tool (e.g. in a proof) than as a general purpose object?
> Whereas programming has more “small steps”: even someone who’s not smart (but has grit) can write an impressive program, if they write one component at a time and there aren’t too many components that rely on each other.
This is well put. I often wonder if a merely average working memory might be a benefit to (or at least may place a lower bound on the output quality of) a programmer tasked with writing maintainable code. You cannot possibly deliver working spaghetti if you can't recall what you wrote three minutes ago.
This is a baldly self-serving hypothesis.
Forth programmers make a similar point. Forth is stack based; you typically use stack operations rather than local variables. This is ok when your 'words' (analogous to functions/procedures) have short and simple definitions, but code can quickly become unreadable if they don't. In this way, the language strongly nudges the programmer toward developing composable words with simple definitions.
(Of course, Forth sees little use today, and it hasn't won over the masses with its approach, but the broader point stands.)
Strongly disagree.
There are plenty of cases where non-modular code is more readable and faster than modular code (littered, presumably, with invocations of modular logic).
There are also countless cases, particularly in low-level languages or languages with manual memory management, where the best solution -- or the correct solution -- is far from readable.
Readability is anyways in the eye of the beholder (My code is readable. Isn't yours?) and takes a back seat to formal requirements.
Just as a matter of personal style, I prefer long, well-organized functions that are minimally modular. In my experience, context changes -- file boundaries, new function contexts, etc. -- are the single greatest source of complexity and bugs once one has shipped a piece of code and needs to maintain it or build on it. Modular code, by multiplying those, tends to obscure complexity and augment organizational problems by making them harder to reason about and fix. Longer functions certainly feel less readable initially, but I'd wager they produce better, clearer mental models, leading to better solutions.
The study itself claims:
- fluid reasoning and working-memory capacity explained 34%
- language aptitude (17%)
- resting-state EEG power in beta and low-gamma bands (10%)
- numeracy (2%)
They take math skills to equal numeracy. The study itself implies this too. I disagree on a fundamental level with that. Math skills align much more closely to fluid reasoning than to numeracy.
In small peer groups (“pods”) that debug and learn together, communication becomes a core skill—and that can actually change how math skills are applied and developed. Language doesn’t just support learning; it reshapes the process.
This has the effect of making programming easier, but don't confuse it.
English majors.
It'd be interesting to see correlations (language brain vs math brain) for how easy or hard it is for people to solve new problems with language after they already know the basics.
Because, if you do what we do, it's obvious that language > math for most of this stuff.
I'd go so far as to say that music > math for programming.
Code written by programmers with humanities backgrounds is easily identifiable as being of bad quality.
Kind of like vibe coding meets programming by accident, before vibe coding was really a term.
It's a faux pas to even mention this IRL, but good coders know what's up.
Those "coders" usually get promoted to people managers, which is usually what they want anyway because their self-worth relies on abusing others to mitigate the correct self-perception they have of being "inferior".
The problem is, things need to be solved and vibe+accident programming can only go so far.
But fear not, they can always scapegoat whoever solves the problems, because if they were not to blame, how could they know what was up or even feel the need to correct it?
Even if many of the bad coders are people who came from the humanities and lack coding experience because they just entered the field (once you get the experience, you no longer count as a "humanities background").
As a new computer science professor at a community college, this is a timely article for me that may end up influencing how I teach, especially my introductory programming course as well as my discrete mathematics course.
That aside, I wonder if early programming was much more math-heavy, and whether higher-level languages have successively reduced that need over time.
Mathematics itself is a human-made formal language that can be bootstrapped from definitions and axioms of logic and set theory, which have to be given in human language first.
Experienced mathematicians read formal theorems written in Greek letters off their blackboards as if they were normal English, suggesting they think about them like normal English. This is not to say they cannot also view, in front of their mental eye, visual representations isomorphic with that language if they choose to.
It's an actively stupid fiction for people who don't understand what math is.
This comes off quite judgmental, and doesn't help me understand your actual point. Could you elaborate on the differences, as you see them?
In some sense, an undergraduate math education is akin to learning the “standard library” (in the software engineering sense) of higher mathematics. Most courses start with basic abstractions of some mathematical object and repeatedly construct more and more abstractions on top of those. The structure of those abstractions is similar to how you might build a library. A professional mathematician is expected to be fluent in the mathematical standard library, just like how you might expect an experienced software engineer to be fluent in Python’s standard library.
If this analogy is true, people who can learn Python relatively quickly might be able to also learn higher mathematics relatively quickly under the right pedagogical environment.
This was kind of how math classes worked, but without that explicit phrasing. It would certainly make the analogy between the two activities more obvious. I also wonder whether people would have less trouble with quantifiers if they were phrased in programming terms: a proof of "forall x, p(x)" is a function x=>p(x), and a proof of "there exists x such that p(x)" is a pair (x, p(x)). e.g.
    Continuity: (epsilon: R, h: epsilon>0, x_0: R) => (delta: R, h2: (x: R, h3: d(x,x_0) < delta) => d(f(x),f(x_0)) < epsilon)
    Uniform continuity: (epsilon: R, h: epsilon>0) => (delta: R, h2: (x: R, x_0: R, h3: d(x,x_0) < delta) => d(f(x),f(x_0)) < epsilon)
    proof of UC => C = (epsilon, h, x_0) => let delta, h2 = UC(epsilon,h) in (delta, h2(_,x_0,_))
So when you're trying to figure out how to do the proof, it's clear what kind of type you need to return and your IDE could help you with autocomplete based on type inference.
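For what it's worth, this is essentially what proof assistants do. A rough sketch of the UC => C proof above in Lean 4 with Mathlib (modulo exact import and instance names, which I may be misremembering):

    import Mathlib.Topology.MetricSpace.Basic

    theorem uc_implies_c {α β : Type} [MetricSpace α] [MetricSpace β] (f : α → β)
        (huc : ∀ ε > 0, ∃ δ > 0, ∀ x x₀, dist x x₀ < δ → dist (f x) (f x₀) < ε) :
        ∀ x₀, ∀ ε > 0, ∃ δ > 0, ∀ x, dist x x₀ < δ → dist (f x) (f x₀) < ε := by
      intro x₀ ε hε
      -- uniform continuity hands us a δ that works everywhere...
      obtain ⟨δ, hδ, h⟩ := huc ε hε
      -- ...so in particular it works at x₀: the pair (delta, h2(_, x₀, _)).
      exact ⟨δ, hδ, fun x hx => h x x₀ hx⟩

The tactic state at each step plays the role of the IDE autocomplete mentioned above: the goal's type tells you what kind of term you still need to return.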
The comment below/above sort of explains it, except I'd argue covering it starts/can start/should start much earlier than at university. Basically, the vast majority of most branches of math, especially pure math, involves very little in the way of calculation. And the way math gets taught (and stupid claims about "math brain") means that many kids who aren't in the top few % of "doing calculation" never get to do the classes where it's less important.
LLMs, trained on words/tokens and symbols and logical combinations of them, are proving to be good at math, and bad at calculation/arithmetic. If an LLM went to school we'd never let it train on real math tokens. It would get shoved in the corner as a "model bad at math" because it was a "model bad at arithmetic".
Poetry is the art of giving different names to the same thing. Mathematics is the art of giving the same name to different things. (Henri Poincaré)
It's basically being good at "naming things", in particular abstractions.
Did Robert Frost say something like, "all language is poetry?" A chair is the term I learned to describe the object I am currently sitting on while typing this comment. Germans might call the same object a 'Stuhl', the French might call it a 'chaise', etc.. My point being that the object isn't technically any of those words, but rather those words symbolically map to that object.
Tell that to the people who designed the study, the people that approved the study, and the people that funded the study.
At the end of the day, the study set out certain criteria for assessing certain skill types/competencies and divided the people by those well defined criteria. I think it’s pretty hard to argue against the idea that people might have at least some level of aptitude for different types of activity and skill.
It leads to different styles of thinking and problem solving.
I think most people would agree that problem solving, expressing ideas verbally, and expressing ideas in math are very different skills.
A good place to look is after brain trauma. Where people end up with weird missing functionality.
Oliver Sacks writes about some of the weird side effects of brain trauma.
We could experiment on mathematicians or writers by causing trauma to different parts of the brain...
Or neural stimulation/anaesthetic of brain regions might show something.
Which... is what math is, too.
First red flag is here. The title rewrote this to be language only. That problem solving skills are relevant is pretty obvious, but language less so.
I’ve been programming for most of my life and I don’t consider myself a very good speaker. My language skills are passable. And learning new languages? Forget it. So I’m skeptical. Let’s look at the study.
First of all, "math" becomes "numeracy". I think programming is probably closer to algebra, but even then it's less strict and easier to debug.
> Assessed using a Rasch-Based Numeracy Scale which was created by evaluating 18 numeracy questions across multiple measures and determining the 8 most predictive items.
Also, the whole thing is 5 years old now.
To me, problem solving ability is precisely the same as the ability to articulate the problem and a solution. I don't see a major difference.
If you can solve a problem but you can't articulate what the problem is or why the solution will address it, I wouldn't call you a good problem solver. If you can articulate the problem well but not come up with a solution, you're already doing better than a lot of programmers in the world, and I'd probably prefer working with you over someone who presents the solution without "showing their work".
In fact, what is problem solving without such articulation? It's hard to even grasp what the skill means in a raw sense. Arguably creativity in this context is just the ability to reframe a problem in a more approachable manner. Many times, if not most times, such framing implies some obvious solution or sets of solutions with clear tradeoffs.
If you’re debugging, you can get by for a long time by trying things until the compiler shuts up. It’s not efficient or good but people do it.
I have a very difficult time trying to extract the difference between "linguistic ability" and "critical thinking", though:
1. The core difference between "critical thinking" and "uncritical thinking" is the ability to discern incoherency from coherency.
2. Coherency is evaluated at the linguistic level: do the terms bind in meaningful ways to the problem? Do any statements contradict each other?
3. The remaining aspect is "creativity": can you come up with novel ways of approaching the problem? This is the hardest to tie to linguistic ability because it sort of exists outside our ability to operate within an agreed context.
So while I agree these are distinct skills, I still have difficulty identifying what remains in "critical thinking" after linguistic ability is addressed.
The recruiter labels an algorithm problem as a "coding" test, but it's a math test, and concludes that most applicants who claim to be fluent in a programming language can't code and must have lied on their resume.
For context, I don't mind algorithm tests, but I strongly disagree with recruiters presenting it as a coding assessment.
Even if CS is sort of applied mathematics.
Math isn't about calculations/computations, it is about patterns. You get to algebra and think "what are these letters doing in my math" but once you get further you think "what are these numbers doing in my math?"
A great tragedy we have in math education is that we focus so much on calculation. There are tons of useful subjects that are only taught once people get to an undergraduate math degree or grad school, despite being understandable by children: the basics of things like group theory, combinatorics, graphs, set theory, category theory, etc. All of these also have herculean levels of depth, but there's plenty in them that formalizes our way of thinking yet is easily understandable by children. If you want to see an example, I recommend Visual Group Theory[0].

Math is all about abstraction, and for some reason we reserve that till "late in the game". But I can certainly say that getting this stuff accelerates learning and has had a profound effect on the way I think. An important component of that is really taking the abstraction to heart, not getting in your own way by thinking these tools only apply in very specific applications.

A lot of people struggle with word problems, but even though they might involve silly situations like having a cousin named Throckmorton or him wanting to buy 500 watermelons, they really are part of that connection from math to reality.
This is why "advanced" math accelerated my learning: that "level" of math is about teaching you abstractions, ways to think. These are tremendously helpful even if you do not end up writing down equations. Math isn't really about writing down equations, but we do it because it sure helps, especially when shit gets complicated.
[0] https://www.youtube.com/watch?v=UwTQdOop-nU&list=PLwV-9DG53N...
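To make that concrete, here's a minimal sketch (my own illustration, not from the linked course) of the kind of group-theory basics a child could follow: checking the group axioms for the integers mod 5 under addition.

    # Integers mod 5 under addition: a small, checkable group.
    n = 5
    elements = range(n)
    add = lambda a, b: (a + b) % n

    # Closure: every sum lands back in the set.
    assert all(add(a, b) in elements for a in elements for b in elements)
    # Associativity: (a + b) + c == a + (b + c).
    assert all(add(add(a, b), c) == add(a, add(b, c))
               for a in elements for b in elements for c in elements)
    # Identity: 0 leaves every element unchanged.
    assert all(add(a, 0) == a for a in elements)
    # Inverses: every element has one.
    assert all(any(add(a, b) == 0 for b in elements) for a in elements)
    print("(Z_5, +) satisfies the group axioms")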
I mean, is it any surprise kids get bored in math? They spend years learning the same thing: years on addition and multiplication (subtraction and division are the same operators). I sure as hell hated math as a kid. It especially doesn't help that it is always taught by someone who's also dispassionate about the subject. You really can get a process going where most middle schoolers are doing calculus and linear algebra (and advanced ones are doing this in elementary school). It isn't as far-fetched as many would believe.
Indeed. Kids did not evolve to learn things because they would be useful years down the road. But they did evolve to inhale knowledge at very high rates wherever it extends their capabilities in the moment.
My take on this is that math should be tied to kids crafting, experimenting, and learned directly in the context of its fun and creative uses.
Geometry screams out to be developed within a context of design, crafting, art and physical puzzles. Algebra, trigonometry, calculus ... they all have direct (and fun) uses, especially at the introduction stage.
The availability of graphics software, sim worlds, 3D printing, etc. should make math, from the simplest to the most advanced, more fun and immediately applicable than ever.
Boredom with math is a failure to design education for human beings.
(Then there is the worst crime of all - moving kids through math in cohorts, pushing individuals both faster and slower than they are able to absorb it. Profound failure by design.)
> Indeed. Kids did not evolve to learn things because they would be useful years down the road. But they did evolve to inhale knowledge at very high rates wherever it extends their capabilities in the moment.
I strongly disagree with this. Children play. Most animals play. It is not hard to see how the skills learned through play lead to future utility despite potentially none at the time. We invent new games, new rules, and constantly imagine new worlds. The skills gained from these often have no utility beyond what is self-constructed, though many do have future rewards. I think we often miss those because the path runs through generalization: learning how to throw a ball can help with learning physics, writing, driving, and much more. These don't transfer perfectly, but it'd be naive to conclude that there aren't overlaps.

I think the truth is that as long as we are unable to predict the future, we really can't predict what skills will be useful or what specific knowledge should be pursued. We can do this in a more abstract sense, since we can better predict the near future than the far, but that also means we should learn creativity and abstraction, as these allow us to adapt to change. And that is why I believe we evolved these methods, and why you see things like play in most creatures.
But if you're willing to hunt, I know this idea was attempted before[0]. France and the USSR had better success than the US. I'm sure there are still people working in this direction. I don't have children, but fwiw I've taught my nieces and nephews algebra and even some calculus before they were 10, just during visiting time on vacations. They were bored, and it seemed more fun than talking about the drama, politics, and religion that the rest of my family likes to spend most of their time on. The kids were similarly disinterested in that stuff lol. I've also seen my god{son,daughter} learn these types of skills, so I'm highly confident it is doable.
I was never told, for example, that matrices are a useful abstraction (shorthand?) for representing linear equations.
Or that imaginary numbers were an invented abstraction that made certain calculations easier.
It would have been nice to learn it from the perspective of how those abstractions came into being and why they're useful.
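For what it's worth, here's a tiny NumPy sketch (my example, not the poster's) of exactly that abstraction: a system of linear equations becomes a single matrix equation.

    import numpy as np

    # The system  2x + 3y = 8
    #              x -  y = -1
    # is just A @ v = b once you adopt the matrix abstraction.
    A = np.array([[2.0, 3.0],
                  [1.0, -1.0]])
    b = np.array([8.0, -1.0])

    v = np.linalg.solve(A, b)  # one call solves the whole system
    print(v)                   # [1. 2.]  ->  x = 1, y = 2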
The patterns of repetitions of single things, repetitions over different things, repetitions between different things, repetitions over repetitions, ...
Natural/counting numbers are just the simplest patterns of repetition.
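A sketch of that idea (my illustration): represent a number as "apply a function that many times", and addition falls out as repetition of repetition.

    # A number n, viewed as "repeat f on x, n times".
    def repeat(n, f, x):
        for _ in range(n):
            x = f(x)
        return x

    succ = lambda k: k + 1          # counting: repeated "add one"
    three = repeat(3, succ, 0)      # 3
    seven = repeat(4, succ, three)  # addition: keep repeating from 3
    print(three, seven)             # 3 7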
Congratulations on overcoming the (very weird) choice of whoever taught your intro to linear algebra class.
Regardless, it's a shockingly common occurrence. I'd agree it's not the right way, but it is a common way. That perspective might define those priors.
But yes, I am expressing honest admiration—it was a bad move on the part of the teacher I think, which the poster seems to have overcome!
I went through the same thing, it really isn't easy. But it is also why I don't blame others for not seeing it. It would be hypocritical to do so. I'm just not sure what's the best way to share, I'm open to ideas. Unfortunately we have to contend with priors that we see here, though I'm happy if my efforts even make one more person able to share in this beauty.
But nobody even told us why! And at that time I never thought to ask.
I'm not going to sit here and act like I was a star student or anything. I was more of a class clown type. I absolutely hate math and all things math. That was, until I went to college. A switch flipped when I was in a Calculus II class.
Our professor asked, "What is 100 divided by 0?" People in the class responded with, "You can't divide by 0 because it's undefined." To which our professor responded, "Why?" Then a student took out a calculator and showed the professor that the answer was indeed undefined. To which he responded, "Ok, how do you know that calculator is correct? Do you just believe it because people told you that dividing by zero is undefined? Ok, the answer is undefined... but why?"
Right then and there, a switch flipped in my brain. I realized that I was basically institutionalized to just not question math and to just accept that things work a certain way "just because." It was at that point I actually started to become interested in math, and it completely changed my outlook on math in a positive manner. I love math now (despite being horrible at it).
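A quick numeric hint at the professor's "why" (my sketch, not his actual lesson): as the divisor shrinks toward zero, the quotient never settles on any value, and no candidate answer q can satisfy q * 0 == 100.

    # Approaching zero from both sides: the quotient diverges,
    # with a sign that depends on the direction of approach.
    for d in [0.1, 0.01, 0.001, -0.001, -0.01, -0.1]:
        print(f"100 / {d} = {100 / d}")

    # And algebraically: if 100/0 = q, then q*0 = 100.
    # But q*0 = 0 for every q, so no such q exists.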
> I’ve always been a straight-A student.
> if something was not written in a book I could not invent it unless it was a rather useless variation of a known theory. More annoyingly, I found it very hard to challenge the status-quo, to question what I had learned.
I don't think this part is isolated to math, but there's a lot of acceptance of "because" as an answer. Being in an authoritative position and being challenged can be frustrating, but I think a lot of that frustration is self-generated. We stifle creativity when young, so it should be no surprise that when people do challenge, they frequently don't have the tools to do so (or to be receptive) effectively. Truthfully, "I don't know"[1] still shuts down the conversation. But I think people themselves are uncomfortable with not knowing. I know I am! For me, though, that isn't a feeling of shame; it is a feeling that creates drive.
[0] https://thomwolf.io/blog/scientific-ai.html
[1] Alternatively, variations like: "That's a good question, I don't know" or "Great question, but we don't have the tools to address that yet". The last one need not undermine your authority either. Truthfully, the common responses resulted in me becoming strongly anti-authoritarian. Which, also has greatly benefited my career as a researcher. So, thanks lol
I have the same experience as you did, in that studying by myself abstract algebra accelerated my learning and reasoning skills.
I don't think this experience is uncommon (among people who get to these levels). Which is why it is really sad. Especially given how linear algebra and abstract algebra are in a lot of ways easier than calculus. I also think they should be taught earlier purely due to the fact that they teach abstraction and reasoning.
Language also concerns form. Grammar has form. Concepts are forms.
Math is language. 'Everything' is language. Language is the image of reality.
In the beginning was the Logos...
> Math is language.
Facts. I think it is hard to disagree with this, and it seems like a rare opinion to be held by people at high levels.

> Language is the image of reality.
An image?[0] ;)

> Math major here
Forgive me if I doubt, but your comment would strongly suggest otherwise, along with this one[0]. The reason for doubt is a failure in fairly basic logic. Your claim is:
¬(A ↦ B) ⟹ ¬(B ↦ A)
I'd expect anyone willing to claim the title of "math major" is aware of structures other than bijections and isomorphisms. The mapping operator doesn't form an abelian group.

This is not number theory, not set theory, just logic. I don't see how the properties of abelian groups apply. I do suspect that non-abelian groups are a case where "¬(A ↦ B) ⟹ ¬(B ↦ A)" is false, and that would be a contradiction of my counter-claim if "¬(A ↦ B) ⟹ ¬(B ↦ A)" were my counter-claim. No, my counter-claim is simply that because "B != A", therefore "A != B".
The statement "¬(A ↦ B) ⟹ ¬(B ↦ A)" is quite different, and I'm not 100% sure whether you are mixing set and logic notations? [0][1] It does seem you are bringing in number theory concepts, or we are misunderstanding one another?
I assume "↦" is the set mapping operator and "⟹" is logical implies, and "¬" is logical not. "¬(A ↦ B) ⟹ ¬(B ↦ A)" could potentially be phrased as: "If the element A cannot be mapped to B, then the element B cannot be mapped to A". I would agree that statement is not 'generally' true. Could you please clarify so we are not talking past each other.
[0] Per google: The symbol "↦" is a mapping symbol, typically used to indicate a function or relationship where one element maps to another. It's not a standard logical operator in the same way that symbols like &&, ||, or ¬ are. Instead, it represents a directional relationship between sets or elements, often seen in set theory and mathematical notations
Try that in a philosophy class and you can expect an F.
A math class too.
The simple refutation was: because B != A, therefore A != B
That WAS the math exam!! No group or set theory needed. It's simple and this is not interesting.
You say that like you think it's a qualification?
You're not making it look good. A common use of "is" is to express set membership rather than identity. For example, "two is an even number" or "tigers are cats".
We may hope that one day you'll come to realize that set membership is not reflexive, and - more to the point - also not symmetric.
If we do want to talk sets, that seems far more interesting. The statements like "Math is a language", or "Math has equivalence classes within languages", or "Mathematics are a Language" are slightly more interesting to consider IMO.
>> Math major here.
> You say that like you think it's a qualification?
Agreed, an appeal-to-authority fallacy. I'll take that over mis-framing any day though.
Any old-timers might appreciate that we're arguing over the meaning of the word "is" =D
The conclusion of the study was that linguistic aptitude seemed to be more correlated with programming aptitude than mathematical aptitude, which seems fairly interesting, and also fairly unconcerned with which specific physical regions in the brain might happen to be involved.
> The conclusion of the study was that linguistic aptitude seemed to be more correlated with programming aptitude than mathematical aptitude
And this is what I'm pushing back against and where I think you've misinterpreted.

> They found that how well students learned Python was mostly explained by general cognitive abilities (problem solving and working memory), while how quickly they learned was explained by both general cognitive skills and language aptitude.
I made the claim that these are in fact math skills, which most people confuse with arithmetic. Math is a language. It is a language we created to help with abstraction. Code is math. There's no question about this. Go look into lambda calculus and the Church-Turing Thesis. There is much more in this direction too. And of course, we should have a clear connection tying it all together if you're able to see some abstraction.

Language is not math, therefore math is not language.
There is no problem with A -> B ∧ B -/-> A
Here's an example. "I live in San Francisco" would imply "I live in the US". But "I live in the US" does not mean "I live in San Francisco".
Here's a more formal representation of this: https://en.wikipedia.org/wiki/Bijection,_injection_and_surje...
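A tiny truth-table check of that asymmetry (my sketch): "A implies B" can hold while "B implies A" fails.

    # Material implication: a -> b is (not a) or b.
    implies = lambda a, b: (not a) or b

    for a in (False, True):
        for b in (False, True):
            print(f"a={a!s:5} b={b!s:5}  a->b: {implies(a, b)!s:5}  b->a: {implies(b, a)}")
    # The row a=False, b=True gives a->b True but b->a False,
    # matching the San Francisco / US example above.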
The statement "Math is Language", where A is Math and B is Language, maps to the logical assertion: "A = B".
If we are going to be really twisty and non-standard, we could interpret the English "is" as "is an equivalence class of", which would map to your example pretty well: language is indeed an equivalence class of math, but math is not an equivalence class of language.

Though nobody is talking about an implies operator or equivalence classes here. It's an "is" relationship: logical *equals*.
From my experience, my ability to articulate myself well is bound up with my ability to abstract and detect patterns. It is the same thing I apply to crafting software, the same thing I apply to creating visual art.
I think high-cognitive-ability people segregating themselves into artsy vs mathy people has more to do with their experiences in their formative years.
At the same time, the study excluded "five participants ... due to attrition (not completing the training sessions), and one participant ... because he was an extreme outlier in learning rate (>3 sd away from the mean)." I mean, if you are to exclude 15% of your subjects without looking at their aptitude (maybe they didn't do it because it was too hard to pass the training tests to move to the next lesson, yet their language aptitude is high?), with only 36 subjects of which 21 are female (it's obvious programming is male dominated, so they only had 15 males: maybe it doesn't matter, but maybe it does), how can you claim any statistical significance with such small numbers?
> the original study uses "numeracy"
That's fair, although I don't think it changes my response. And the article still really leads to the wrong conclusions. You want to teach children abstraction and reasoning? You teach them math. Not numeracy, math.

Ultimately, I believe basic algebra and geometry are the most important takeaways from math classes for most people.
[1]: https://www.hs.fi/tiede/art-2000004823594.html (sorry, it's in Finnish and behind a paywall)
As it was explained to me, one wouldn't take a "Calculus I" class as a prerequisite for say an entry-level engineering course. One typically had such a strong foundation of algebra, that when encountering a problem that required calculus, the student would just learn the necessary calculus at that point in time. In other words, with such a strong algebraic background, other aspects of math, within reason, were much easier to grok.
> the results weren't that good; it proved to be too abstract for young kids
You cannot draw that conclusion from the evidence. Yes, the evidence might support that conclusion, but there are many others it could support equally well. For example, they could have just been really bad at teaching it; that even seems likely, as it is difficult to perform such a reformulation both broadly and quickly.

The other reason I'm willing to accept alternative conclusions is that France and the USSR had far more success than Finland (or even America). Their success contradicts the claim that "[it is] too abstract for young kids". You'd need to constrain it to something like "[it is] too abstract for Finnish kids", which I think both of us would doubt.
It seems plainly obvious that this language just means “areas of brain that activate when dealing with math problems” vs “areas of brain that activate when dealing with language problems” and yes there is hard evidence that there is a difference between them.
I feel weird answering, even if I infer the right question, because it feels tautological. If you have Wernicke's aphasia, does that not create the possibility that I already have answered, but your condition resulted in misunderstanding? Given the condition, does it not create a high probability that a response will similarly be misunderstood? Is not anosognosia quite common?
Maybe I'm really misunderstanding but honestly I'm not sure what you're asking
We had basically four tracks: one ended with you doing algebra 1 in senior year, another ended with trig in senior year, yet another ended with trig (no precalculus), and one ended with trig and precalc. That final track had further subdivisions too small to have their own full classes: some kids just did precalc, some did calc 1 and took the AP Calculus AB exam and/or IB Math SL, while an even smaller group took AP Calculus BC and/or IB Math HL. The total number of kids who took the AP Calc AB exam in my year was 20ish, out of a graduating class of 500-600.
> the copious amount of tiering of math classes seemed to indicate to me that either we're really bad at teaching math to 80% of people, or only 20% of people will be able to handle precalculus
Or maybe it indicates that the people designing this system should be fired? Job security through complexity?
(Or maybe I’m just biased by the system I know… I’m just asking questions)
> … I’m just asking questions
I'm not disagreeing, but I wanted to point out that this phrasing is commonly used by bad-faith actors. I'm not saying you're using it this way, and I legitimately do not think you are, but your comment could be interpreted another way and this phrase might be evidence for someone making that interpretation.

The classic example is conspiracy theorists, but lots of bad-faith actors also use it to create leading questions that ignore or lead away from important context. Again, I do not think you're doing this; the rest of your comment makes me think you're acting in good faith.
Math classes weren't separated by grade. So, I took Algebra 2 in 8th grade alongside a cross section of the school; there was one other 8th grader, two seniors, and a selection of people in between.
There was no path to take the AP Calculus AB. Trig / Calc A was offered as two semesters, and then Calc B / Calc C was offered as two more semesters, after which you'd take the BC test. There was also no such thing as "precalculus". Trig followed Algebra II.
In my Calc C class, there were probably 8ish people, of which one or two (besides me) would have been in my grade.
Did you even read beyond the silly headline?
The article itself is about pre-testing subjects on a range of capabilities from problem solving ability to second (foreign) language learning ability, and then seeing how these correlated to the ability of the test subjects to learn to code.
The results were basically exactly what might be expected - people who learned Python the quickest were those who scored the best at learning a second language, and those who learned to wield it the best were those who had scored the best at problem solving.
Not surprisingly math ability wasn't much of a predictor since programming has little to nothing to do with math.
> Did you even read beyond the silly headline?
Yes. I'll also refer you to the HN guideline on this matter. You're welcome to disagree with me, but you must communicate in good faith, and unless you have a very specific reason for thinking I didn't "RTFM", don't make the accusation.

I'm happy to continue discussing, but only on those terms. In fact, I think we're in far more agreement than your tone suggests. But I think you missed the crux of my point: math isn't number crunching.
“Music class is where we take out our staff paper, our teacher puts some notes on the board, and we copy them or transpose them into a different key. We have to make sure to get the clefs and key signatures right, and our teacher is very picky about making sure we fill in our quarter-notes completely. One time we had a chromatic scale problem and I did it right, but the teacher gave me no credit because I had the stems pointing the wrong way.”
It is a great read!
It's surprisingly common. Case in point: "The unreasonable effectiveness of mathematics in the natural sciences".
A normal person wouldn't be surprised that describing how something works is a good way to understand it.
One could argue that all we do is turn thoughts, senses, and memories into further thoughts and appropriate actions, which is applying a pattern, which is math. But at that point the definition is too broad to be helpful for anything but playing word games.
Oh wait, neuroscientists; that explains it all. A statistician's favourite target for being unable to interpret data correctly.
I don’t know about for learning but definitely for collaborating and mentoring. And it’s difficult to make a definition of mastery that excludes both of those, so I suppose after a fashion it’s right.
Despite being a professed lover of math, I scored higher on the verbal than the math SAT. There’s a lot of persuasive and descriptive writing in software, particularly if you’re trying to build a team that works smarter instead of finding more corners to cut.
Math is VASTLY different, with VASTLY more concepts that are much more abstract in nature, and it's harder to understand the infinite number of different ways one mathematical construct can be applied to another. A person can "master" coding, but no one ever masters math.
So comparing math to language or to coding is silly. They're completely separate domains. Just because each of the three can encode and represent the other two doesn't make them similar in any way whatsoever.
This is beyond silly from my perspective. I know the field of CS is vast, but this seems to conflate programming with CS. My school was more theory-heavy, and there definitely came a point in certain paths of study where I didn't touch a line of code for a year, just pure math. I struggle to even understand how someone can write this sentence; computer science at its core is underpinned by mathematics.
"Coding largely involves the 'logical part' of your brain. It tends to not include the 'language part' of your brain.
This is one reason why comments you add to code are so useful: they force you to engage both parts of your brain and therefore get better results.
This is also why when you go to explain a problem to a colleague, you often have a flash of brilliance and solve the problem: you are reframing the problem using a different part of your brain and the different part of your brain helps solve the problem."
I'm sure some readers here will say this is preposterous and there is no such thing as having "two parts of your brain".
To them I suggest watching:
1. "You are are two" (about people with their corpus callosum being severed) https://www.youtube.com/watch?v=wfYbgdo8e-8
2. "Conscious Ants and Human Hives" by Peter Watts https://www.youtube.com/watch?v=v4uwaw_5Q3I
It wasn't until high school, when I tested into the highest math class the school offered, that I began to unlock (with some initial struggle) the more logical and procedural reasoning specific to mathematics. I had always done well in math but never explicitly went above and beyond, despite hints of aptitude in the arithmetic competitions my school would hold and that sort of thing. I just think my brain works well for both the linguistic aspects of programming (more naturally) and the computational problem-solving aspects. Certainly there are individuals who have strengths in both cognitive aspects, despite being more naturally attuned to one versus the other, at least presumably.
Perhaps this shows a cognitive profile that has natural strengths in both "brains", or maybe this highlights limitations of the article's potentially narrow definitions of "language" and "math", implying a more complex intellectual landscape.
Interesting findings nonetheless.
People who end up being the best programmers have a deeper appreciation for semantics and information flow, but tend to commit more type II errors early on, making them inferior intro CS students.
Much of the CS curriculum (and typically also the required maths curriculum) in universities still favors the first type of student over the second, driving out the most capable and creative minds.
If you try programming and you don't like it chances are you won't be very good at it.
* It's a small sample, and they did not analyze the people who didn't complete the course. That's dubious. Those 6 could have had a massive influence on the outcome.
* The summary does not present the actual numbers. These are: "fluid reasoning and working-memory capacity explained 34% of the variance, followed by language aptitude (17%), resting-state EEG power in beta and low-gamma bands (10%), and numeracy (2%)". Note: numeracy, not math.
* The test result was only partially programming related. 50% consisted of the results of a multiple-choice test with questions such as "What does the str() method do?". Linguistic knowledge indeed.
* It's about completing a 7.5 hour Python course. That's learning indeed, but only the very beginning, where abstraction is not in play. The early phase is about welding bits of syntax into working order.
* The numeracy skills required for such tasks are very low, as the tasks are simple and mainly require thinking in steps and loops, whereas numeracy aptitude is generally measured with problems involving fractions.
Edit: the paper uses the Rasch-Based Numeracy Scale for this, which seems to involve estimation and probabilities.
* 17% explained variance is a rather minimal result, and you cannot easily compare factors in such a small design, even if the other one is only 2%. That's a rather hairy statistical undertaking.
* Linguistic expedience might explain the speed with which the course was completed, since the instruction is, obviously, linguistic. Hence, this factor is not necessarily related to the actual learning or programming.
* The argument from beta waves is clutching at straws.
* The argument that "perhaps women should have more of a reputation for being “good” at programming" because they score better on tests, is --however well meant-- utterly ridiculous. It reverses correlation to causation and then turns that into a fact.
* That linguistic skills are useful for programmers is widely understood. However, this is not because of the actual coding, but because the coder needs to understand the environment, the specs, the users, etc., all of which is transferred via language.
* And of course, the statistical result relies on Null Hypothesis Significance Testing, which is rotten in its very foundations.
* Note that the CodeAcademy course "Learn Python 3" is 23 hours in 14 lessons.
Also, the article doesn't mention "math skills". It talks about numeracy, which is defined in a cited paper as "the ability to understand, manipulate, and use numerical information, including probabilities". This is only a very small part of mathematics. I would even argue that mathematics involves a lot of problem solving, and since problem solving is a good predictor, math skills are a good predictor.
Seeing as Codecademy lessons are written in English, I would think this may just be a result of participants with higher Language Aptitude being faster readers.
I do think that language skills are undervalued for programming, if only for their impact on your ability to read and write documentation or specifications, but I'm not sure this study demonstrates that link in a meaningful way.
Well yes, my high school math scores were in the high 90s, higher than my language scores in French, German, and Latin, with some extracurricular Russian. I guess being a polymath helps.
Unless you are doing an engineering or mathematical application you don't need much math, especially as you can just call a function in the vast majority of the time.
I did a number of software products and operating system modifications without using any math beyond arithmetic operations.
I was a resource for other programmers including the odd math PhD.
I learned to program when I was a kid and my maths skills were super basic. Programming can almost be distilled to something as basic as "if this, then do that", plus "do this x times". Then read API documentation and call functions that do what the docs say.
With just this basic understanding you can create a lot of stuff. The maths is obviously the foundation of computation, but to be a programming language user and build stuff, you don't actually need to understand most of it.
In university I did eventually do some math-y stuff (econ degree so prerequisites in stats, maths and even CS) and it helps with certain stuff (understanding graphics programming, ML and LLMs, plus knowing maths on its own is useful), but I still don't feel it was strictly necessary. Language and basic logic is enough IMO.
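To illustrate (a toy of my own, not from any course): most early programs really are just those two moves, plus calling functions the docs describe.

    # "do this x times" + "if this, then do that"
    temperatures = [18, 21, 25, 30, 16]

    for t in temperatures:       # repeat for each reading
        if t > 24:               # if this...
            print(t, "warm")     # ...then do that
        else:
            print(t, "cool")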
I feel the same way about starting learning programming. Repetition, repetition, repetition, until you "get good".
> All participants were right-handed native English speakers with no exposure to a second natural language before the age of 6 years
Which removes a confounder that Python mimics English syntax.
Still if this is a typical study recruiting thirty-some undergrads as subjects it's probably not generalizable, or even replicable given the same experimental setup.
I often tell people it's not that learning a language is hard; it's learning that language's software library... and learning a software library doesn't feel like learning a language. More like learning a set of tools.
Programming is the manifestation of thought through the medium of a keyboard and screen. If you are a clear thinker, if you can hold multiple things in your head at once, if you can reason about things and their relations, well, you can be a strong programmer.
It seems wholly unremarkable to me that someone new to Python would not be fazed by it, given its fundamental basis in words (print, if, etc.). Someone with a background in languages, who can think well enough to explicitly or implicitly pick up the structure of the language, is gonna do just fine. "Oh, so when I see a while, I need to end with a colon" isn't so different from "when I shout, I need to add a ! at the end".
(Java gets a special place in hell for forcing "public static void main" on every beginner.)
Math only really comes into it when you want to reason about things that have a history of being manipulated mathematically, typically for reasons of convenience. You could probably invert a matrix in SNOBOL, but it's a lot easier to pull out lists and arrays and linear algebra.
In other words, let's see the follow-up paper to this where Python newbies are asked to model the trajectory of a bead on a wire and see how they do.
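For a taste of what that follow-up would involve, here's a stand-in sketch of mine (a free projectile rather than the constrained bead, which is harder): even the simple case immediately drags in vectors and numerical integration.

    # Explicit Euler steps for a projectile under gravity.
    dt, g = 0.001, 9.81
    x, y = 0.0, 0.0
    vx, vy = 10.0, 10.0          # initial velocity components, m/s

    while y >= 0.0:              # step until it comes back down
        x += vx * dt
        y += vy * dt
        vy -= g * dt

    print(f"range ~ {x:.1f} m")  # analytic answer 2*vx*vy/g ~ 20.4 m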
Does answering a quiz on the contents of the first lesson on how to program in Python really encapsulate anything concrete about who will and will not be able to actually program in Python?
I've always been disturbed by the disconnect between "first lessons" on programming languages and how I personally actually learn programming languages. I can't help thinking that the researchers have measured something else other than whether people have learned to program.
But as a matter of practice, teaching programming to engineers/scientists, even to mathematicians, is an order of magnitude easier than teaching math to CS folks. Simply quiz job candidates on floating-point arithmetic and see how many fail miserably.
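The classic quiz question, for anyone who wants to try it on themselves (standard Python float behavior, nothing interview-specific):

    # Binary floats can't represent 0.1 or 0.2 exactly,
    # so the sum isn't exactly 0.3.
    print(0.1 + 0.2 == 0.3)   # False
    print(0.1 + 0.2)          # 0.30000000000000004

    # The usual fix: compare within a tolerance.
    import math
    print(math.isclose(0.1 + 0.2, 0.3))  # True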
Lately I’ve also felt language skills matter when writing concise, specific AI prompts. This has become a useful part of my programming workflow in, I suppose, the last year or so. Before that it was knowing “how to Google” well, but that’s less language-dependent in my opinion.
Probably the most valuable math classes for me were the ones that had me use algebra to solve word problems.
And, fundamentally, all languages describe the same stuff, using different tokens. That is pretty much in line with programming languages.
I believe the goal is to encourage those (young people) allergic to math but good in languages to realize they could be good at programming. That's worthy and important (though ironic to use a scientific study to do so).
As for the larger question commenters are raising, I notice often programmers reducing programming to problem-solving, and that to modeling, thence to math writ large; then they prove their point by noting that the most significant algorithms have strong and deep roots in math. The argument is: if the pinnacle of programming is math, then the more math-like programming is (in people and style), the better. (Hence: functional programming and provable algorithms are the epitome)
This argument is invariably made only by math experts, and does have the effect of reducing competition and selecting for like-minded persons, so there's at least the possibility of infection with self-interest and bias to the familiar.
I think the real distinction lies in this: with math, once you have a model, you fit things into that, and exclude other things. You do perfectly, but only in your domain. With language, any differance (any difference that makes a difference) starts to be named and tracked and dealt with somehow. It may grow in confusing ways, but it turns out (like architecture) it actually makes much more sense to adapt the structure to the use and relevance than vice-versa.
Sure, some subset of architecture is engineering by necessity, and it's probably even the hardest subset. But that's also the most portable knowledge and practice, and I would argue, thus easier to find and deploy once you've isolated such a need, particularly since mathy stuff fits nicely in a reusable library.
So math is an important but relatively isolated part of programming, and thinking it's ideal for everything is missing the point that programming matters for some purpose.
"attention is all you need" is but one example of the priority of relevance over structure in systems.
Computer science is much more than programming - and I think that most of the value derived is from being able to think about problems, which largely require the abstract type of thinking encouraged by more advanced math. Code is just a tool.
This classic article explains the real issue - like Mike Gancarz' classic on the Unix Philosophy, this is something all younger hackers should read, but few have, since these are the fundamental ideas that have created our modern world of computers: https://web.archive.org/web/20000529125023/http://www.wenet....
For example: "Matter determines consciousness." If we apply first-principles thinking—where does matter come from? This statement then becomes: "XXX created matter, and matter determines consciousness." At this point, our interest shifts to the first-principle subject "XXX," focusing our attention on it. Who is XXX? God?
In this thought process, we use the subject-predicate-object grammatical structure to trace back the original subject "matter" in "matter determines consciousness": where does matter come from? Although this formal reasoning does not involve specific mathematical formulas, it does employ formal logic to uncover a flaw, and it opens the door to a deeper rabbit hole.
If you ever need to get into the guts of a system or need to solve bleeding-edge problems for which good abstractions don't yet exist, the "math brain" becomes significantly more relevant.
I say this as someone who studied literature and philosophy. The majority of what I know about programming and software engineering I either taught myself or learned from the tutelage of others on the job. Early on in my career, a solid mathematics background was, indeed, not that relevant. These days, though, I'd be lost without it. Whether you like it or not, when it comes to doing real engineering you necessarily need to establish bounds and prove things about those bounds and typically you'll need to do this numerically or at the very least using inductive structures. Linguistic aptitude is still relevant, but it helps less in these cases.
An interesting book that illustrates the evolving societal perception of mathematicians (and by extension, computer scientists) over time is "Duel at Dawn". The modern (i.e., recent) notion is the reclusive genius (maybe even a suggestion of autism) who possesses the alien, superhuman, unattainable, too-cool-for-you ability to process numbers and information that you can only be born with. (Those familiar with the TV show "The Big Bang Theory" would recognize the trope of the "Sheldon Cooper" character.) This is false.
The reality is that no one is born with the super-human ability to do anything - anyone who is very good at something has worked very hard to get good at that thing regardless of the perception.
edit: my initial criticism of “the study” was based upon the article. On a skim of the actual cited paper, I revised my specific criticism, but the actual paper still comes off as no more than a mild eugenics argument dressed in psychology and statistics jargon.