EDIT: I found the quote, from chapter 6:
Psychometricians have closely questioned American scientists of this first modern generation, curious to know what kind of men they were—there were few women among them—and from what backgrounds they emerged. Small liberal arts colleges in the Middle West and on the Pacific coast, one study found, were most productive of scientists then (by contrast, New England in the same period excelled at the manufacture of lawyers).
...
Theoretical physicists averaged the highest verbal IQs among all scientists studied, clustering around 170, almost 20 percent higher than the experimentalists. Theoreticians also averaged the highest spatial IQs, experimentalists ranking second.
[1] https://worldscientific.com/worldscibooks/10.1142/q0436#t=ab...
The field probably does itself a disservice by overemphasising math. That framing can push people away who might actually do really well, especially those strong in reasoning, abstraction, or communication. Linked study is a good reminder to rethink how we present programming imo.
You can see visual reasoning as a little computational cheat: you can run math problems through your sense-determining brain, which is what brains are really good at (robots struggle to match our levels of dexterity). But the fact remains that you can only visualize in low dimensions, and there are infinitely many dimensions.
Note: You can reduce many problems to 3d, but also many problems in 3d have configuration spaces with much higher dimension, so there's some nuance.
I remember when I first started working on my Master's project on wireless sensor networks, my advisor sat me down and said "I think I know a good project for you. I want you to print out the source code for TinyOS, study it for a week, and come back to me when you think you know enough to make these changes." This was a sort of formative experience for me, and ever since, when joining a new project, I've made sure to take the time to read through the code to understand how things fit together.
Terrible at math, I hate it and feel dyslexic trying to read most mathematical writing. I excelled at it in elementary school, then quickly came to feel frustratingly stupid at it as it became less about algorithms (more on that in a bit...) and all about equations and abstract stuff with unknown applications.
However, programming was natural and easy to pick up. I've repeatedly had to take more time convincing myself I actually understand some supposedly "hard" thing, like pointers or recursion, than it took to learn them in the first place, because they were in fact very easy to understand so I kept second-guessing myself—"I must not get it, because that was easy". I've been the go-to guy for "hard" and "low-level" problems basically everywhere I've worked.
What I've noticed is that when I must read math, the only way I can make any headway is to turn everything into steps, through which some example values may pass and affect one another. I have to turn it all into algorithms. Algorithms, I can get. Attempts to express meaning through equations and proofs, no, I have to painstakingly turn every single boundary between every symbol into a step and "walk through" it to have any hope of understanding it, and once I do, this new understanding only barely illuminates the original representation.
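To make that concrete, here's roughly what "turning math into steps" looks like; a made-up illustration of my own (the formula and values are arbitrary), turning the summation of i squared for i from 1 to n into a loop I can walk example values through:

    # A hypothetical sketch: the formula "sum of i^2 for i = 1..n" as an algorithm.
    def sum_of_squares(n):
        total = 0
        for i in range(1, n + 1):  # step through each i
            total += i * i         # watch the running value change
        return total

    print(sum_of_squares(4))  # 1 + 4 + 9 + 16 = 30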
I think programming clicked for me because, as typically encountered and taught, it's very heavy on algorithms and very light on other varieties of mathematical presentation. Plus, having so very much more context available about what variables represent and what routines do, than a jumble of letters and symbols. FFS, if we say Perl is line noise, what's mathematical writing? Straight gibberish from a brain-wrecked Cthulhu cultist? Perl's the clearest thing in the world by comparison!
... where I do run into trouble is languages with "mathy" syntax, where the idiomatic style favors single-letter variables and accomplishing-things-by-asserting-equality. I can't read Haskell to save my life. Put the same supposedly-tricky concepts (monads, type classes) in any more-ordinary language, and it's easy, but when I tried to learn them using Haskell, I couldn't get anywhere at all. Shit, it takes me forever just to understand fizzbuzz-level programs written in Haskell.
It prevented me from having a CS degree, I was unable to complete the math courses, but as far as actual programming and "software engineering" goes (design, etc) it's never hindered me. I can work out the logic and I let the computer do the math.
Edit: I'm downvoted below zero for this comment. I don't know what people are so offended by?
> It prevented me from having a CS degree, I was unable to complete the math courses, but as far as actual programming and "software engineering" goes (design, etc) it's never hindered me. I can work out the logic and I let the computer do the math.
This is what's wild to me: I have a long, successful career in a "STEM" field that's allegedly math-heavy, while being practically incapable of working with math. Like, it's never even been slightly a problem. I can't relate at all to characterizations of programming as heavy on math. It's never been my experience of it, and at this rate, probably never will be. If it were, I'd for-sure be in a different job.
I think one "verbal" skill that has served me well is fast reading. When you have to read a 300-page novel once a week, you learn how to skim for key elements, which is immensely useful for getting up to speed in a new codebase/language or locating a bug.
> though 18 years in food service means I'm really quick at estimating percentages to within delta
This is an interesting comment! No trolling: Were you a bread or pastry baker? I am curious to hear more about this experience.

This is me and also many of the CS students in my cohort, and AFAIK something that universities actively selected for, and students also self-selected around in an era of RTFM/MUDs/IRC before LLMs or youtube. The best programmers I've worked with are still always very linguistically brained.. polyglots even when they didn't have to be, or with a long track record of engaging with difficult literature. If nothing else.. just very witty in that certain way that's meta-cognitive, meta-linguistic.
This is still true I think, but it's much harder to see out in the wild due to the degree/career trajectory popularity. Plus, as long as we're optimizing for leet-coding even though built-from-scratch algorithms are a very rare need compared to skills with good design/exposition.. naturally the math-brain is favored.
Some schools like MIT might have required more, but on average what I wrote was about it. Has it increased since then? Based on the new hires I've seen the last decade I'd have guessed the math requirements were mostly the same.
Anyway, both computation and math are grouped under "apriori" knowledge. Any semantic distinction is ultimately silly. But we could just as easily be teaching programming as a craft in the context of the real world—I think this is closer to how it's done outside the US. I am not at all convinced the American style is what people ought to be paying for.
Yeah, I never thought this made sense, but so many people did, and I always hear people on Slashdot talking about how programming IS math. None of that has been my personal experience, and I'm coming up on 21 years as a software engineer. Discrete was the ONLY math course that I really enjoyed and did well in the first time around. For me, this always made sense.
I can count the times I've ever applied math past approximately high school algebra 1, on one hand. Period, in private life, in hobbies, at work. I'm not sure I've ever used any "college level" math, for anything at all.
I, and other programmers I've known, have gotten excited on the very few occasions anything even slightly mathematically-tricky came up, precisely because it almost never happens.
You chose CS but really wanted Software Engineering. Discrete Mathematics was all you needed for that.
It helped that Python was meant to resemble natural language. I had learned C++ and Perl before but they never stuck, because I never made the connection to language. Ironically, since Perl was designed by a linguist!
Kurt Vonnegut: See, I came up through a chemistry department.
Charlie Rose: Yeah, right.
Kurt Vonnegut: And so I wrote and there was nobody there to tell me whether it was any good or not. I was just making my soul grow, writing stories.
There's some stuff about his opinion on training for writing that could be relevant:
https://charlierose.com/videos/25437
I don't think it's fair to attribute anything to anything. Stuff comes from all over the place. In other words, attributing programming prowess to math was a mistake, and we are making the same mistake again attributing it to language.
---
Just one more:
Kurt Vonnegut: --consider himself in competition with a world's champion. And this is one reason good writers are unlikely to come from an English Department. It's because the English Department teaches you good taste too early.
I think his main point is when we put something on a pedestal, we actually limit people, whether that be math or language.
My experience is that they spit out reasonable-looking solutions that then don't even parse/compile.
They are OK for creating small snippets of code and for completion.
Anything past that, they suck.
It's actually hilarious that AI "solved" bullshitting and artistic fields much better and faster than, say, reasoning fields like math or programming.
It's the supreme irony. Even 5 years ago the status quo was saying artistic fields were completely safe from the AI apocalypse.
Just as an LLM may be good at spitting out code that looks plausible but fails to work, diffusion models are good at spitting out art that looks shiny but is lacking in any real creativity or artistic expression.
My experience with that is that artistic milieus now sometimes even explicitly admit that the difference is who created the art.
"Human that suffered and created something" => high quality art
"The exact same thing but by a machine" => soulless claptrap
It's not about the end result.
A lot could be written about this but it's completely socially unacceptable.
Whether an analogous thing will happen with beautiful mathematical proofs or physical theories remains to be seen. I for one am curious, but as far as art is concerned, in my view it's done.
This has nothing to do with whether a human or AI created the art, and I don't think it's controversial to say that AI-generated art is derivative; the models are literally trained to mimic existing artwork.
Your "creativity" is just "high temperature" novel art done by the right person/entity.
This was something already obvious to anyone paying attention. Innovation from the "wrong people" was just "sophomoric", derivative or another euphemism, but the same thing from the right person would be a work of genius.
It's like people enjoy extrapolating their surprise when it comes to LLMs, and I don't think it's very helpful.
Oddly, I also use spatial intuition when thinking about stuff like stacks and the shape of data structures.
You sure about that? How about inductive proofs?
I would just say that language is more familiar to most. Mathematics is also a language, but one more formal and foreign to most.
Edit: added a definition of apriori knowledge.
Edit2: to put this another way, nobody is arguing that recursion doesn't exist. Or that it is empirically-derived. No, it's a useful construct to show certain relations.
Edit3: added a sentence
Edit4: The extent to which our own grammars are inherently recursive vs this being culture or technology is irrelevant to identifying the concept of recursion as an apriori, linguistic concept.
Edit5: i suppose you might also be referring to the idea that we naturally process recursion. I mean, we clearly, evidently do; whether or not that's inherent to being human is a separate question entirely. Hell in the free software world there's a whole recursive acronym meme that taps into some part of our brain and tickles it.
With that argument everything is fundamentally linguistic since everything is communicated using a language.
Can you come up with a more reasonable argument?
If recursion was just writing the function 10 times like you did in language then people wouldn't struggle with it.
So in this case, "recursive function" would be "clause" or something like that; I'm no linguist. But clauses can embed clauses which can embed further clauses, etc.
I think your usage of recursive functions is just high-level logic—you're describing an inductive proof. We also frame a lot of our social games as recursive processes. But these are conscious processes that we can evaluate consciously; the recursion in spoken language is largely unconscious and very shallow.
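A toy sketch of that clause-embedding idea (entirely my own illustration, not from any linguistics source) showing why the recursion in natural language stays shallow even though the grammar allows arbitrary depth:

    # Clauses can embed clauses recursively, but we only cope with shallow nesting.
    def nested(depth):
        if depth == 0:
            return "the mouse ran"
        return "the cat that saw [" + nested(depth - 1) + "] meowed"

    print(nested(1))  # easy to follow
    print(nested(4))  # still grammatical, but mentally taxing to parse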
But people are constructing sentences, not grammars. When you construct a grammar you can add a recursive part to it, that is true, just like in a programming language, but constructing grammars is not what people mean with language skills.
A sentence can't be recursive since languages in themselves have no concept of applying a concept; for that you need an interpretation of the language's references. For example, you can have a recursive function written in a programming language that doesn't have a recursive grammar; the concepts are different things.
1. Our spoken and especially written grammar is recursive. We do handle this unconsciously. This is not related to our ability to reason about recursion at a high level, and recursive grammars are not necessary to do so. This is not a skill in the normal sense and we have only (very) limited ability to improve our capacity to interpret deeply nested grammars. However, this is still a useful illustration of what recursion IS, which is why I brought it up.
2. Language also introduces the ability to semantically reason about recursiveness. This is still a linguistic thing—you need a symbol and relations among symbols in order for recursion to be meaningful—but this is a skill and is likely very related to linguistic skill. This is the part that really helps you to program: ultimately, you're just reasoning about symbols and looking for incoherency.
What does this mean exactly?
> What does this mean exactly?
- Defining recursion is linguistic
- Defining a function recursively is mathematical
Besides, a lot of what people mean when they say they're bad at math is that they're bad at arithmetic, which is honestly understandable.
But that isn't what we mean with recursive function. We don't call this recursive:
x = x + 1
It's just incrementing x.

That's not a recursive function as it's written, but you could certainly consider it a form of symbolic recursion. This just isn't a very useful characterization in an iterative/imperative context. You could frame incrementing as recursive, though—this is just the Peano axioms/Church encoding.
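A rough sketch of that Peano-style framing (my own illustration, nothing more): increment is the "successor" step, and addition can be defined recursively on top of it.

    # a + 0 = a, and a + succ(b) = succ(a + b), assuming b is a non-negative integer.
    def succ(n):
        return n + 1

    def add(a, b):
        return a if b == 0 else succ(add(a, b - 1))

    print(add(3, 4))  # 7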
I've always wondered why FP isn't more popular. I concluded it's because most folks don't like thinking like abstract math.
I think the problem-solving part of coding requires math skills, while the organization part requires writing skills. The organization part affects the problem-solving part, because if you write messy code (that you can’t reread once you forget or extend without rewriting) you’ll quickly get overwhelmed.
Writing large math proofs also requires organization skills, since you’ll refer to earlier sections of your proof and may have to modify it when you encounter issues. But to me, math seems to have more “big steps”: sudden insights that can’t be derived from writing (“how did you discover this?”), and concepts that are intrinsically complicated so one can’t really explain them no matter how well they can write. Whereas programming has more “small steps”: even someone who’s not smart (but has grit) can write an impressive program, if they write one component at a time and there aren’t too many components that rely on each other.
Just like in math.
BTW relational DBs are math.
It's funny, reading the post you're replying to, I basically read it as
> I don't need math, I need <math, but by another name>
My teenage daughter used to complain about math, and I spent some time trying to explain to her that we use math every day... EVERY day. Now, when I see her do something that was math (even if it's not obvious it was math), I say "Math... every day". I say that a lot.
Also, yes, my daughter finds me annoying. But also funny; but likely not for the math thing.
> I don't need math, I need <math, but by another name>
This seems to be how it always goes. I think we've confused a lot of people by conflating math with arithmetic.

So how much of programming is maths? Before we answer that, let's answer: How much of maths is actually maths? Because first we define maths, and then we define programming based on whatever that is, but until we have that first concrete definition this discussion cannot occur.
I will add that "it is taught by the maths department in college" is a flimsy argument, and frankly one the Physics department in particular would mock.
That means that definition shifts over time. For example, courses on numerical analysis, graph algorithms, programming, and on compilers used to be part of “what you’d learn in a math department”.
It likely also even today will show geographical variation.
I find this distinction useful in the abstract, that one can engage different parts of the brain for different components of development. This probably explains why a well-written DSL can be so powerful in the right context.
Firstly, computer science is math.
Secondly, I remember covering graphs in a discrete math course back when I was in college.
> What if you do it in SQL?
SQL is more-or-less a poor implementation of relational algebra. Ie, math.
But it's hardly a useful grouping any more. You can study and do well in computer science with minimal knowledge of most of the core mathematical subjects.
While graph theory certainly crosses over into math, you can cover most of the parts of it relevant to most computer science as a discussion of algorithms that would not be the natural way of dealing with them for most mathematicians.
You will fail at Theoretical Computer Science without mathematical proficiency. Go read some textbooks and papers in theoretical CS. It is a subfield of mathematics. Theorems and proofs. Rigorous and difficult mathematics.
I wrote my MSc thesis on the use of statistical methods for reducing error rates for OCR, and most of the papers in my literature review hardly required more than basic arithmetic and algebra.
So I stand by my statement.
Sure, there are subsets of computer science where you need more maths, just like in any field there are sub fields where you will need to understand other subjects as well, but that does not alter what I claimed.
EDIT:
Some authors are quicker to pull out the maths than others, and frankly in a lot of CS papers maths is used to obscure a lack of rigor rather than to provide it. E.g. the problem I ran across when writing my thesis was that once you unpacked the limited math into code, you'd often reveal unstated assumptions that were less than obvious if you just read their formulas.
I agree it's not a useful grouping in practice. I'm just interested in what makes you think like you do.
What I claimed was that in computer science we often discuss things in terms that would not be the natural way of dealing with it in maths. We do that because our focus is different, and our abstractions are different.
It doesn't mean it's not math. It means it's not useful to insist that it isn't a different field, and it's obtuse when people insist it's all the same.
> Most coding doesn't need much of any math past boolean logic and very basic set operations
Coding IS math.

Not "coding uses math", I mean it is math.
Mathematicians do not deal in objects, but in relations among objects; they are free to replace some object by others so long as the relations remain unchanged. Content to them is irrelevant; they are interested in form only.
- Poincare[0]
I don't know how you code, but I don't think I'm aware of code that can't be reasonably explained as forming relationships between objects. The fact that we can trace a program seems to necessitate this.

[0] https://philosophy.stackexchange.com/questions/22440/what-di...
But I'll refer you to a longer conversation if it helps https://news.ycombinator.com/item?id=43872687
Coding used to be very close to pure math (many early computer science classes were taught in the Math Department in universities) but it has been so far abstracted from that to the point that it is its own thing and is as close to math as any other subject is.
> By that same logic you could also say that language is math
Not quite, but the inverse is true. The language-to-math direction doesn't work because of a lack of formalism. I can state incomprehensible sentences or words. (There's an advantage to that in some cases!) But when you do that with code you get errors, and even if you do it with math, it's just that there's no compiler or interpreter that yells at you.

Since you can express paradoxes with math, perhaps not that different.
The contradiction is used in proof formulation, specifically to invalidate some claim. I don't think this is what you're implying.
The latter is what it contextually sounds like you're stating; things like the Banach-Tarski Paradox. There's no self-contradiction in that, but it is an unexpected result and points to the need to refine certain things like ZFC set theory.
I'd also stress that there are true statements which cannot be proven through axiomatic systems. The Halting Problem is an example of what Godel proved. But that's not contradictory, even if unexpected or frustrating.
> coding and math being equivalent
Please see lambda calculus. I mean equivalent in the way mathematicians do: that we can uniquely map everything from one set to another.

> I mean equivalent in the way mathematicians do
That sounds like you're backing off from your original claim, probably because it is impossible to defend.
That you can use mathematics to describe code doesn't seem very different from using math to describe gravity, or the projected winner in an election, or how sound waves propagate.
Isn't the primary purpose of math to describe the world around us?
Then it shouldn't be surprising that it can also be used to describe programming.
In the real world, however, software engineering has nothing to do with mathematical abstractions 99% of the time
Though interpreting a CRUD app as a theorem (or collection of theorems) doesn’t result in an interesting theorem, and interpreting a typical theorem as a program… well, sometimes the result would be a useful program, but often it wouldn’t be.
It's not the type of thing that gets mathematicians excited, but from an engineering perspective, such theorems are great. You can often blindly code your way through things by just following the type signatures and having a vague sense of what you want to accomplish.
It's actually the halting problem that I find is not relevant to practical programming; in practice, CRUD apps are basically a trivial loop around a dispatcher into a bunch of simple functions operating on bounded data. The hard parts have been neatly tidied away into databases and operating systems (which for practical purposes, you can usually import as "axioms").
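To put a shape on that "trivial loop around a dispatcher" claim, a minimal sketch (the handler names and request format are made up for illustration):

    handlers = {
        "create": lambda data: print("created", data),
        "read":   lambda data: print("read", data),
    }

    def serve(requests):
        for op, data in requests:   # the trivial loop
            handlers[op](data)      # dispatch into a simple, bounded function

    serve([("create", {"id": 1}), ("read", {"id": 1})])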
> It's not the type of thing that gets mathematicians excited
Says who? I've certainly seen mathematicians get excited about these kinds of things. Frequently they study Programming Languages and will talk your ear off about Category Theory.

> You can often blindly code your way through things by just following the type signatures and having a vague sense of what you want to accomplish.
Sounds like math to me. A simple and imprecise math, but still math via Poincare's description.

> in practice, CRUD apps are basically a trivial loop around a dispatcher into a bunch of simple functions operating on bounded data
In common settings. But those settings also change. You may see those uncommon settings as not practical or useful but I'd say that studying those uncommon settings is necessary for them to become practical and useful (presumably with additional benefits that the current paradigm doesn't have).

> Isn't the primary purpose of math to describe the world around us?
No, that's Physics[0]. I joke that "Physics is the subset of mathematics that reflects the observable world." This is also a jab at String Theorists[1].

Physicists use math, but that doesn't mean it is math. It's not the only language at their disposal nor do they use all of math.
> software engineering has nothing to do with mathematical abstractions 99% of the time
I'd argue that 100% of the time it has to do with mathematical abstractions. Please read the Poincare quote again. Take a moment to digest his meaning. Determine what an "object" means. What he means by "[content] is irrelevant" and why only form matters. I'll give you a lead: a class object isn't the only type of object in programming, nor is a type object. :)

[0] Technically a specific (class of) physics, but the physics that any reasonable reader knows I'm referencing. But hey, I'll be a tad pedantic.
[1] String Theory is untestable, therefore doesn't really reflect the observable world. Even if all observable consequences could be explained through this theory it would still be indistinguishable from any other alternative theory which could do so. But we're getting too meta and this joke is rarely enjoyed outside mathematician and physicist communities.
Going on a total tangent, if you'll forgive me, and I ask purely as a curious outsider: do you think math could have ever come into being if it weren't to fill the human need of describing and categorizing the world?
What would have been the very beginning of math, the first human thought, or word or action, that could be called "math"? Are you able to picture this?
> do you think math could have ever come into being if it weren't to fill the human need of describing and categorizing the world?
I'm a bit confused. What exactly is the counterfactual[0] here? If it is hyper-specific to categorizing and describing then I think yes, those creatures could still invent math.

But my confusion is because I'm having a difficult time thinking where such things aren't also necessary consequences of just being a living being in general. I cannot think of a single creature that does not also have some world model, even if that model is very poor. My cat understands physics and math, even though her understandings are quite naive (also Wittgenstein[1] is quite wrong. I can understand my cat, even if not completely and even though she has a much harder time understanding me). More naive than say the Greeks, but they were also significantly more naive than your average math undergrad and I wouldn't say the Greeks "didn't do math".
It necessitates a threshold value and I'm not sure that this is useful framing. At least until we have a mutual understanding of what threshold we're concerned with. Frankly, we often place these contrived thresholds/barriers in continuous processes. They can be helpful but they also lead to a lot of confusion.
> What would have been the very beginning of math
This too is hard to describe. Mull over the Poincare quote a bit. There's many thresholds we could pick from.

I could say when some of the Greeks got tired of arguing with people who were just pulling shit out of their asses, but that'd ignore the many times other civilizations independently did the same.
I could say when the first conscious creature arose (I don't know when this was). It needed to understand itself (an object) and its relationship to others. Other creatures, other things, other... objects.
I could also say the first living creature. As I said above, even a bad world model has some understanding that there are objects and relationships between them.
I could also say it always was. But then we get into a "tree falls in a forest and no one is around to hear it" type of thing (also with the prior one). Acoustic vibrations is a fine definition, but so is "what one hears".
I'd more put the line closer to "Greeks" (and probably conscious). The reason for this is formalization, and I think this is a sufficient point where there's near universal agreement. In quotes because I'll accept any point in time that can qualify with the intended distinction, which is really hard to pin-point. I'm certainly not a historian nor remotely qualified to point to a reasonable time lol. But this also seems to be a point in history often referenced as being near "the birth" and frankly I'm more interested in other questions/topics than really getting to the bottom of this one. It also seems unprovable, and I'm okay with that. I'm not so certain it matters when that happened.
To clarify, I do not think life itself necessitates this type of formalization though. I'm unsure what conditions are necessary for this to happen (as an ML researcher I am concerned with this question though), but it does seem to be a natural consequence of a sufficient level of intelligence.
I'll put it this way, if we meet an alien creature I would be astonished if they did not have math. I have no reason to believe that their math would look remotely similar to ours, and I do think there would be difficulties in communicating, but if we both understand Poincare's meaning then it'll surely make that process easier.
Sorry, I know that was long and probably confusing. I just don't have a great answer. Certainly I don't know the answer either. So all I can give are some of my thoughts.
[0] https://www.inference.vc/causal-inference-3-counterfactuals/
That's not unique; all quantitative theories allow small modifications. Then you select the most parsimonious theory.
We didn't bring up the Strong Church-Turing Thesis in discussions of programming languages meant for use by humans.
Mathematics is a very extensive field, and covers a vast amount of subjects.
For the same reason it can't be said that mathematics is equivalent to coding, as there are many things in mathematics that are not relevant for coding.
However, by far the most interesting parts of coding are definitely related to mathematics.
Math is about abstractions and relations. See the Poincare quote again.
Plus, the Programming Languages people would like to have a word with you. Two actually: Category Theory. But really, if you get them started they won't shut up. That's either a great time or a terrible time, but I think for most it is the latter.
Wheeler is arguing that if a tree falls in the forest, and there is nobody to hear it, that it still makes a sound because there are things that interact with the sound. But if the tree fell in a forest and there was nothing else in the universe then there is no sound because there is no observation.
It helps to read the whole thing[0] and to understand the context of the discussion. This is meta-physics and a deep discussion into what the nature of reality is. Ian Hacking has a good introduction to the subject, but I find people develop grave misunderstandings when they do not also have the strong math and physics background necessary to parse the words. Even people who understand the silliness of "The Secret" and that an observer need not be human often believe that this necessitates a multi-verse. A wildly convoluted solution to the problem of entropy not being invertible. Or closer to computer terms, a solution that insists that P = NP. There is information lost.
If you wanna argue that there's no difference between the word cup and a cup itself because there is no word without the observer who has the language, then yeah.
Programming is an expression of logic, which is absolutely mathematics.
But then we also have to think about naming variables and classes, structuring our code so that it is more readable by other developers, and so on. That's less about formal reasoning and more about communication.
There is an engineering aspect to programming (prototyping, architecture, optimization, etc). It's a combination of mathematics and engineering. Software Engineering.
Structuring sentences and naming variables so that it is easier for other people to understand is less about formal mathematical reasoning, and more about communication.
You could name a variable x, y, or Banana, but it doesn't change the logic.
I mean the reason we get mad at this is because it is someone destroying "society" in some sense. Even if that society is your team or just the set of programmers. It would be a pretty dick move were I to just use a word that significantly diverged from conventional meaning and expect you to mull it over. Similarly if I drop an unknown, out-of-context math equation. It would be meaningless.
And I'm on your side, really! I strongly advocate for documenting. And let's be real, the conclusion of your argument more strongly argues for documentation than good variable names. Because variable names are much more constrained and much more easily misinterpreted considering how any word has multiple definitions. Surrounding code is often insufficient to derive necessary contextualization.
Mathematicians do have to deal with difficulties in naming things.
> They just use i, x, etc. all over the place
I do agree with your point btw, but I did want to note that there are good conventions around symbols. The brevity is heavily influenced by the medium. Variable names sucked when you had punch cards. It's still burdensome to write long names when using paper, chalkboard, whiteboard, or any system that doesn't have autocomplete.

In general, lower case letters are used as constants, excluding x,y,z,t,i,j,k (sometimes u,v,w). It isn't a hard rule, but there's a strong preference to begin at the beginning of the alphabet for these. Capital letters are usually held for things like Variable Sets (like random variables). Greek letters need context for constants or variables. BB and Cal typefaces for sets (e.g. Real Numbers, Integers). And much more.
I think a lot of the difficulty in it is that these "rules" or patterns are generally learned through usage and often not explicitly stated. But learning them can really help read unfamiliar topics and is why "notation abuse" leads to confusion. But after all, math is all about abstraction so technically any symbol will do, but no doubt some are (significantly) better than others for communicating.
There are two hard things in Computer Science:
- Cache Invalidation
- Naming Things
- Off-by-One Errors
and also philosophy.
The murky world of software patents would like a word about that.
For me, coding also feels artistic at times. But we see artistic beauty in mathematics all the time, so no wonder. But often I look at ugly code, and maybe that is subjective, but fixing that code makes it also feel prettier.
It is clear all developers have their own artistic style, and probably that is why there is so much disagreement in the industry. Maybe we are lacking the pure mathematical language to describe our intent in a more beautiful and precise way that is more clearly the-right-way. As in how we find beauty in simple physics equations.
> Not "coding uses math", I mean it is math.
Arguably writing a novel is math, if you use the right definition of math. But sometimes it's more helpful to use more informal definitions that capture what people mean than what is technically accurate.
No, not always. Quite a lot of high-level code doesn't require any math at all. It doesn't take math to perform CRUD operations, which account for a lot of programming work. Sure, the underlying machine code is all based on math, but the higher level programming doesn't need to involve a single math equation for it to be useful. Let's see where the goalposts move now...
But the CRUD logic is so basic and boring, so obvious, that it doesn't require any thought.
Which code? The machine code that underlies everything? Or the lines of simple high-level CRUD that don't even need a single number or logic statement to function? Not all programming has to be mathematical or even logical at a high enough level, and I say this as someone who's been coding assembly language for 40 years.
Those lines ARE mathematical logical statements.
Each line defines logical operations that are executed by the computer.
Same for high level or low level programming. It's all logic.
Assigning a variable is a mathematical operation. It is a logical state transition. This is formalized in Programming Language Theory.
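A toy illustration of that view (my own sketch, not a formal semantics): treat the program state as a mapping from names to values, and an assignment as a function from one state to the next.

    # "x = x + 1" as a state transition: {x: 1} -> {x: 2}
    def assign(state, name, value):
        new_state = dict(state)   # the transition produces a new state
        new_state[name] = value
        return new_state

    s0 = {"x": 1}
    s1 = assign(s0, "x", s0["x"] + 1)
    print(s0, "->", s1)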
We are not talking about quarks. We are talking about lines of code which are objectively mathematical statements, no matter how non-mathematical they seem to you subjectively.
You start with a practical problem to solve, and a computing machine that can perform logical operations on data. Your job is to figure out a correct sequence of logical operations that will solve the problem. The program that you have written is a mathematical structure. It is mathematics.
In the case of code, we can argue that the map is the territory.
Those business requirements are inputs to the process of writing the code.
Once the code is actually written, that exists as a formal logical system, defined by mathematics, not business requirements.
>Once the code is actually written, that exists as a formal logical system, defined by mathematics
I still think that's not code, but your favorite model of code. For a spellchecker, language is defined by mathematics too: it splits text into words by whitespace, then for each word not found in the dictionary it selects the best matches and sorts them by relevance. Oh and characters are stored as numbers.
Writing code IS a math skill. When writing code you are writing logic in a formal system. Logic is mathematics.
You may be thinking that mathematics is just like doing arithmetic or solving equations. It is way deeper and broader than that.
> I still think that's not code, but your favourite model of code
Code is not just modelled through mathematics, it is actually defined by mathematics. It is fundamentally a mathematical construct, grounded in formal semantics.
Code is not modelled mathematically, it is defined mathematically.
It exists as an abstraction which is fully defined by operational semantics and denotational semantics, not modelled or approximated.
In the counter example of a quark, that exists in nature and is modelled by mathematics, but not defined by mathematics.
If you're going to accuse someone of confusing the map with the territory, you really should make sure you aren't making the same error.
> It's like trying to argue about the distinction between U(1), the complex numbers with magnitude 1, and the unit circle, and getting upset when the mathematicians say "those are 3 names for the same thing". Or saying that writing C is programming but writing in a functional language like Scala or Haskell (or Lean) is not.
As ndriscoll suggests, it is tautological. I mean look at what I said. I really need you to hear it. I said that coding is math. So what I hear is "How do programming languages help with programming languages?" Why are you expecting me to hear anything different?

> What math says about zero based indexes?
Start at 0? Start at 1? Who cares, it is the same thing. The natural numbers, non-negative integers, integers, even integers, who cares? They're the same thing. And who cares about indexing at 0 or 1 in programming? That's always been a silly argument that's inconsequential.

> How do you prevent off by one errors?
By not being off by one? What's the question? Like being confused about if you start at 0 or start at 1 and how to get the right bound? It is a shift from one to the other, but they are isomorphic. We can perfectly map. But I really don't get the question. You can formalize these relationships with equations you know. I know it isn't "cool" but you can grab a pen and paper (or a whiteboard) and write down your program structure if you are often falling for these mistakes. This seems more about the difficulties of keeping track of a lot of things in your head all at once.
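If it helps, a small made-up sketch of that shift (my own example, nothing more): 0-based and 1-based index sets are isomorphic, only the bounds move by one.

    items = ["a", "b", "c"]
    n = len(items)

    for i in range(0, n):        # 0-based: i in [0, n)
        print(i, items[i])

    for j in range(1, n + 1):    # 1-based: j in [1, n]; shift by one when indexing
        print(j, items[j - 1])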
> How do you prevent buffer overflows?

By not going over your bounds? I'm so confused. I mean you are asking something like "if f(x) = inf when x > 10, how does math help you prevent the output of the function from being infinite?"

Maybe what will help is seeing what some of the Programming Languages people do and why they like Haskell[1].
Or maybe check out Bartosz Milewski[2,3]. His blog[2] is titled "Bartosz Milewski's Programming Cafe: Category Theory, Haskell, Concurrency, C++". It may look very mathy, and you'd be right(!), but it is all about programming! Go check out his Category Theory Course[3], it is _for programmers_.
Don't trust me, go look at papers published in programming language conferences [4]. You'll find plenty of papers that are VERY mathy as well as plenty that are not. It really depends on the topic and what is the best language for the problems they're solving. But you'll certainly find some of the answers you're looking for.
Seriously, don't trust me, verify these things yourself. Google them. Ask an LLM. I don't know what to tell you because these are verifiable things (i.e. my claims are falsifiable!). The only thing you need to do is look.
[0] https://news.ycombinator.com/item?id=43882197
[1] https://excessivelyadequate.com/posts/isomorphisms.html
[2] https://bartoszmilewski.com/
[3] https://www.youtube.com/watch?v=I8LbkfSSR58&list=PLbgaMIhjbm...
Mentioning Scala is ironic: it's very light on math-speak, and to begin with it was created to unify object-oriented with functional programming, which is mathematically meaningless, because both are Turing complete and thus equivalent, tautological.
>So what I hear is "How programming languages helps with programming languages?"
Oh, right, in mathematics an axiom is an argument, but in reality it isn't. In programming you should assert what you assume, otherwise your assumptions can be wrong due to divergence from reality, but there is no reality in mathematics, only fantasy, so you can't understand this with mathematics alone.
No. Code is an abstraction. It exists as a logical structure, grounded in mathematical logic.
> It doesn't take math to perform CRUD operation
Yes it does. Just because the objects you are working with aren't numbers doesn't mean it isn't math. In fact, that's my entire point. It is why I quoted Poincare in the first place. He didn't say "numbers" he said "objects".

In a typical implementation these are database operations. That involves relational algebra operations, state transitions, boolean logic.
The READ part can be a very complex SQL query (composed of many algebraic operations) but even the simplest query (SELECT * from t where ID = 1) is filtering a set based on a predicate. That is mathematics.
No one is moving goalposts. Set theory and logic are at the foundations of mathematics.
This is math:
{x | x.id = 1}
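The same filtering written out as a throwaway sketch (table contents invented for illustration): the SQL query and the set-builder expression describe the same predicate-based filter.

    # SELECT * FROM t WHERE id = 1  ~  {x | x.id = 1}
    t = [{"id": 1, "name": "a"}, {"id": 2, "name": "b"}]
    result = [row for row in t if row["id"] == 1]
    print(result)  # [{'id': 1, 'name': 'a'}]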
OTOH, a SQL query is a SQL query.
This thread is hilarious though. It's like
- Cashier: here is your change.
- Customer: you did math!
- Cashier, no, I gave you change.
- Customer: that IS math!
- Cashier: You mean, I used math to give you change?
- Customer: No, giving change doesn't use math, it IS math!!!! [2]

=D
Moving along.. FWIW, indeed SQL was created to model set theory (relational algebra and tuple calculus); the close relationship is no accident of course [0][1].
> No one is moving goalposts
I feel too they are.
First goal post:
> Coding IS math.

>> No, not always. Quite a lot of high-level code doesn't require any math at all. It doesn't take math to perform CRUD operations
Second goal post:
> CREATE, READ, UPDATE, DELETE are fundamentally mathematical in nature
CRUD is closely related to SQL, and SQL is closely related to various mathematics. Are they identical and therefore equivalent? No - because your database is not going to like it when you write "{x | x.id = 1}", and the Oracle DB might not like something that you can write for your Postgres DB.
[0] https://simpleprogrammer.com/mastering-sql/
[1] https://en.wikipedia.org/wiki/SQL
[2] To quote: """Not "coding uses math", I mean it is math""" @ https://news.ycombinator.com/item?id=43872771
Goes back to this ridiculous proposition:
- Cashier: You mean, I used math to give you change?
- Customer: No, giving change doesn't use math, it IS math!!!! [2]
The proposition is that "code IS math", not defined by, not uses, not inspired by, not relies on, not modeled after, but IS.
Computer programs are proofs[1]. This is intuitively and also formally true. You would agree writing proofs is doing math, yeah? Then obviously writing a computer program is also doing math.
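As a hand-wavy illustration of that correspondence (a toy example of my own, not a formal proof system): a total function of type (A, B) -> (B, A) is a "proof" that A and B implies B and A.

    # Propositions-as-types, informally: inhabiting the type proves the proposition.
    from typing import Tuple, TypeVar

    A = TypeVar("A")
    B = TypeVar("B")

    def swap(pair: Tuple[A, B]) -> Tuple[B, A]:
        a, b = pair
        return (b, a)   # the "proof" of (A and B) -> (B and A)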
Like I have a degree in math and have been a software engineer for over a decade. I do not know what distinction people are trying to get at. It's like trying to argue about the distinction between U(1), the complex numbers with magnitude 1, and the unit circle, and getting upset when the mathematicians say "those are 3 names for the same thing". Or saying that writing C is programming but writing in a functional language like Scala or Haskell (or Lean) is not.
[0] Modulo details like NULLs and multi-set semantics, but surely that's not the distinction?
[1] Up to isomorphism
This is an interesting example: "Or saying that writing C is programming but writing in a functional language like Scala or Haskell (or Lean) is not."
The first part is everything we need to look at. Are we saying that writing C is equivalent and equal to the entirety of all programming? That if you're programming then you are writing C code. No, there is an implied "is a form of" in there. Given the other clarifications and that so many people are claiming to be mathematicians, I would have expected the precision to say exactly "C is a form of programming" rather than "C is programming."
Turns out, the analogy of saying "the set of reals is the set of naturals" is more fitting compared to sets that are actually equal.
In math we are highly concerned with structures and abstraction. We have things called operators. They aren't just addition and multiplication. We also use those words to describe different operations. We have things like groups, rings, fields, and algebras. Yes, plural.
The purpose of these things is to create logical frameworks. It matters not what the operators are. Nor does it matter what objects we operate on. Poincaré is explicitly saying this.
The part you're not understanding is the abstraction. This is what math is about. It is also why the Programming Language people are deeper in the math side and love Category Theory (I have a few friends whose PL dissertations are more math heavy than some math dissertations I've seen). It's no surprise. What's a function? How do you abstract functions? How do you define structure? These are shared critical questions. PL people are more concerned with types, but I wouldn't say that makes them any less of a mathematician than a set theorist.
We can perfectly describe the CRUD operations with set theory. Do most programmers need concern themselves with this? Absolutely not. But is it something the people designing those operations and systems are thinking about? Yes.
I'd encourage you to learn some set theory, abstract algebra, and maybe a bit of cat theory. It'll make these things pretty clear. But I'd caution about having strong opinions on topics you aren't intimately familiar with. Especially when arguing with those that are. Frankly, CRUD is a terrible example. I'm confident a quick google search (or asking a GPT) would quickly point you to relational algebra. It's discussed in most database classes. It certainly was in the one I was an assistant for.
We are talking about textbook CS fundamentals.
As someone that is "on the other side of the fence", and getting flamed for it, maybe I can shed some light since I have the other perspective as well. IMO, The reason for not seeing eye to eye is (for example) akin to saying "word problems are math". (Thinking of a grade school word problems for common reference). Yes, they are readily mapped to mathematical models that can solve the word problem & perhaps almost indistinguishably so. Though no - word problems are not math. Word problems are a series of phrases and words. That's where the talking past each other comes in... Different interpretations of "word problems are math", or "code is math". It's seemingly not clear whether we are talking about logical 'implies', 'element of', or 'equals'.
Which goes to "We can perfectly describe the CRUD operations with set theory.", we all agree there. That is not readily conveyed though when writing things like "code is math".
> That's where the talking past each other comes in...
Then sorry, but that's your own damn fault. I was clear about my definition and quoted a famous mathematician to give it some authority, to not be "trust me even though I'm some rando". The way to respond to that is not "you're wrong, trust me, I'm some rando".

Yes, I agree we're misunderstanding each other because we're using different definitions, but "you've" rejected "mine" without defining "yours" and expecting everyone to understand. Of course that'll lead to confusion. You can reject the mathematicians' definition of math, but you sure gotta say more than "trust me", and it's a pretty wild thing to do, especially as non-mathematicians.
The problem here is one side who's dabbled in cooking says "a chef makes desserts" and chefs are responding "we do a lot more than that". Maybe there's a few chefs that go "yeah all I do is dessert" and everyone points to that while ignoring the second part of their sentence is "but that's just my speciality." Wouldn't you think that conversation is insane? The only reason it's obviously so is because we all know what a chef is and agree on the definition. But who is better qualified to define the chef's job? The chef or consumer?
In another thread, you characterized my response as stating: " ¬(A ↦ B) ⟹ ¬(B ↦ A)" (and this is a great example of language not being math, but math being language!). That was not at all my claim.
My claim is "I believe you are saying 'A = B'. It appears that 'B != A', therefore 'A != B'." My only claims are
(1) I believe you are writing to convey that you mean Math IS Language in the sense they are equal, identical, interchangeable, and fully equivalent, and bi-directionally so
(2) that: B != A
The only results can either be:
- "yeah, because B != A, the statement A = B is not true"
- Your claim (1) is false, I'm not actually saying "A = B"
- Your claim (2) is false, "B = A" is in fact true. I would find that to be an interesting assertion and would have loved to explore more why you think that.
That is a good analogy, except programming languages are formal languages, not natural languages.
> they are readily mapped to mathematical models
With code we are not talking about something that is mapped to mathematical models. Code is not modelled by mathematics, it is defined by mathematics (denotational semantics and operational semantics).
"Code is math" is true in a theoretical sense. Practically, coding doesn't always feel like what is commonly thought of as "doing mathematics", but it is essentially mathematical.
Actually I'm starting to wonder whether the thing that made university math easy for me was I quickly developed a good internal "type checker"/"type inference engine" in my head, and that helped make the next steps of proofs seem straightforward.
[0] https://gist.github.com/ndriscoll/881c4f5f0398039a3a74543450...
Though middle-school or not, it's still math.
For anything CRUD, you put the data into a database. The underlying database is relational, or a key->store.
If it's relational, that's one branch of mathematics. If it's another kind of database, it's another branch of mathematics. Mathematics is extensive, and covers more things you can imagine at a glance.
The main difference between writing mathematics and programming, and this applies to any form of programming, is that in mathematics writing a formal proof is amazingly close to: you write the program, and you are also the compiler, and you are the CPU, performing all operations yourself. With programming you only have to write the program.
Source: On one hand I have studied pure mathematics (not the simplified applied mathematics that are taught in engineering, which is mostly equations), on the other hand I have been working as a software developer for over 15 years.
This is vague and doesn't mean anything. People can't even agree what 'objects' are and people did a lot of programming before the term 'object' was invented.
Programming is fundamentally about instructions and data. Yin and yang, two things that are completely different, even if people can occasionally mix them and conflate them.
I'm surprised I hit a nerve with so many people. I'm quoting someone who's considered one of the greatest mathematicians. Obviously I don't know the backgrounds of people but it seems like programmers have strong opinions on what math is that disagrees with what mathematicians say they do.
At what point does the abstract description just not offer any useful insight anymore?
> I'm surprised I hit a nerve with so many people.
A lot of people had very good explanations for why you're pretty far off in trying to say two things are the same.
Programming is much more like building a machine than any sort of straight math. There is state, there is interactivity, there is performance, there is IO and there are real world implications to all of it.
Saying they are the same is like saying gardening is plumbing just because you sometimes use a hose.
> [No,] it's not math it's 'category theory'
That's a wild claim considering Category Theory is a branch of mathematics | Category theory is a general theory of mathematical structures and their relations.
- https://en.wikipedia.org/wiki/Category_theory
It is necessary that you provide an alternative definition as to what "category theory" is, though I suspect it will make many category theorists and mathematicians upset.

> A lot of people had
A lot of non-mathematicians disagreed with mathematicians.
https://news.ycombinator.com/item?id=43882197

It's not a wild claim since you misquoted me.
> A lot of non-mathematicians disagreed with mathematicians.
Mathematicians can claim whatever they want; when it comes to programming, programmers understand it better and they're trying to explain to you why this is nonsense. Vanderbilt claims to be "the Harvard of the South" but wouldn't you know it, Harvard doesn't claim to be "the Vanderbilt of the North".
Show me programming languages designed by mathematicians to be 'mathematically pure' and I'll show you a language that hasn't been used to ship software that people want to use.
In other words: It's math in a sense that for most of us with a computer science background is often not very relevant to how we work.
... In my experience, learning to write one component at a time (and try the code, and make sure it works before proceeding) is itself a skill that many struggle to develop. Similarly for avoiding unnecessary dependencies between components. Oh, and also being able to analyze the problem and identify separable components.
One of the most frustrating things about teaching programming, for me, is the constant insistence from other teachers that you have to maintain an "absolutely everyone can learn to program" attitude at all times. Many people who start to learn programming have misguided or confused reasons for doing so and - to say the least - could make much more effective use of their time developing other skills. (It's not a question of elitism; I'd surely flounder at some tasks that others find natural.)
I very much think many people could learn the more advanced Excel Formulas, Power Automate and even simple Bash/PowerShell scripting to make their work more effective. I've met quite a few folks who had been intimidated out of trying who could do it.
On the other hand, how many people on this site could bootstrap a linux kernel on either very new or very old hardware? I know there are some, but they are certainly not the majority. I certainly won't be the first person to get linux and doom to run on a quantum computer.
But that is similar to other professions. Everyone with a largely functioning body can learn to turn a few planks and some metal parts into a functional shed door with some basic tools or to put up a decent brick wall that won't topple over in a month.
That doesn't mean everyone is qualified to pour concrete for a dam or a bridge foundation, or to re-do some historical work in original style.
It's shocking how little physical and spatial ability some people have, so that is definitely not true. Sometimes it might be self-discounting or a lack of confidence rather than raw ability, but it remains true regardless of the cause.
> That doesn't mean everyone is qualified to pour concrete for a dam or a bridge foundation, or to re-do some historical work in original style.
Exactly!
I think statements like that are more concerned with philosophy than reality. Any discussion surrounding topics like this typically ends up being a discussion around definitions.
I believe the vast majority of human beings are capable of learning how to program in the most elementary sense of the word. That is, outside of severe disabilities or complete and utter inaccessibility to circumstances in which one could learn to program, I think the remaining population could learn to program to some degree. Obviously, not everyone will learn to program, for a near-infinite number of reasons.
I would argue it's like music. Anyone can make 'music.' Just make a sound -- any sound. The difference between noise and music is subjective. I would not argue that everyone could be the next Lovelace, Turing, Ritchie, Thompson, Torvalds, etc.
Now, for my jaded opinion, I think a lot of the "everyone can learn to program" talk does not come from a place of desire to share the gift of knowledge and joy of programming. I think it's more of a subtle way to encourage people to go into programming so that they may be hired by mega corps. In order to keep the Capitalist machine running. It's like the National Hockey League's slogan, "Hockey is for everyone." That is just a fancy way of saying, "everyone's money can be spent on the NHL."
I might be capable of learning advanced accounting, but that sounds like torture to me and I'll be damned if I'll ever take that on! I'm sure programming feels like that to a wide variety of people, and I don't see any need for us to try to pretend otherwise - outside of a bizarre ideological desire for equivalent outcomes from disparate groups.
I'm sure everyone is capable of learning some basic level of programming, just as they are able to learn a basic (high school) level of any subject. However, not everyone is going to have the aptitude to take that to an advanced professional level, however hard they try. We're not all cut out to be artists, or writers, or doctors, or scientists, or developers, etc.
Personally I've always considered a solid grasp of algebra to be the minimum bar for being able to program, at least for anything that isn't utterly trivial. Being able to take a word problem and turn it into a system of equations and solve it is a pretty close analog to being able to take some sort of problem or business requirement and turn it into code.
And the sad truth is that a huge percentage of the population struggle with just arithmetic, let alone algebra.
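To make that analogy concrete, here is a minimal Python sketch (the ticket counts and prices are made up purely for illustration, not from the study or this thread): the word problem "200 tickets were sold for $2080; adult tickets cost $12 and child tickets $8; how many of each?" becomes two constraints, and the code just searches for values that satisfy them.

    # Hypothetical word problem turned into code: adults + children == 200
    # and 12*adults + 8*children == 2080. A brute-force search over one
    # unknown is the programmer's version of "solve the system".
    def solve_tickets(total_tickets=200, total_revenue=2080,
                      adult_price=12, child_price=8):
        for adults in range(total_tickets + 1):
            children = total_tickets - adults
            if adult_price * adults + child_price * children == total_revenue:
                return adults, children
        return None  # no consistent answer

    print(solve_tickets())  # (120, 80)

The translation step -- naming the unknowns and writing down the constraints -- is the same whether you finish with algebra or with a loop.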
After collecting data for a few semesters he concluded his students could be clearly divided into three categories: those who just "got" programming, those who understood it after working hard, and a small group that just didn't grasp it regardless of effort.
Computation is following an algorithm. e.g. long division or computing a derivative.
Intuition, AKA brilliance, is finding a non-obvious solution. Think "solving an NP problem without brute force"*. e.g. solving an integral (in a form that hasn't already been memorized) or discovering an interesting proof.
Organization is recording information in a way that a) is easy for you to recall later on (and get insights from) and b) is digestible by others**. e.g. explaining how to compute a derivative, solve an integral, or anything else.
Math, programming, and writing each require all skills. The kind of math taught in school (e.g. long division) and your boring jobs are primarily computation. I believe advanced math (e.g. calculus) is primarily intuition; it requires some organization because big theories are broken into smaller steps, but seems to mostly involve smart people "banging their head against the wall" to solve problems that are still quite unclear***.
Programming is primarily organization. It requires some intuition (I think this is why some people seemingly can't learn to code), but in contrast to math, most programs can be broken into many relatively-simple features. IMO implementing all the features and interactions between them without creating a buggy, verbose, and unmaintainable codebase is programming's real challenge. Writing is also primarily organization, but finding interesting ideas requires intuition, and worldbuilding requires computation (even in fiction, there must be some coherence or people won't like your work).
> Some people are really good at mucking around garbage code (they have no choice, they get paid to), but what part of programming did they get good at? Obviously, some part of it, but nothing to write home about.
I agree that work you find boring should be avoided, and I also try to avoid working with it. But some people really seem to like working on esoteric code, and I think there are some skills (beyond computation) developed from it, that even apply when working with good code. Building a mental model of a spaghetti codebase involves organization, and if the codebase uses "genius hacks", intuition. Moreover, the same techniques to discern that two code segments in completely different locations are tightly coupled, may also discern that two seemingly-separate ideas have some connection, leading to an "intuitive" discovery. There's an MIT lecture somewhere that describes how a smart student found interesting work in a factory, and I think ended up optimizing the factory; the lesson was that you can gain some amount of knowledge and growth from pretty much any experience, and sometimes there's a lot of opportunity where you'd least expect it.
* Or maybe it is just brute force but people with this skill ("geniuses") do it very fast.
** These are kind of two separate skills but they're similar. Moreover, b) is more important because it's necessary for problems too large for one person to solve, and it implies a).
*** And whatever method solves these problems doesn't seem to be simplification, because many theories and proofs were initially written down in very obtuse form, then simplified later.
Usually, even deciding what the problem is is partly an art; it requires an act of narrativization, to shape and form concepts of origin, movement, and destination.
A good problem solver has a very wide range of abstract ideas, concepts, and concrete tools they can use to model and explain the problem, the solution, and the destination. Sometimes raw computational intellect can arrive at stunningly good proposals, can see brilliant paths through. But more often, my gut tells me, it's about having a breadth of exposure to different techniques and tools, and being someone who can both see a vast number of ways to tackle a situation and see the tradeoffs in approaches, being able to weigh long- and short-term impacts.
> Its very rare imo that computational problems emerge fully formed & ready to be tackled like proofs.
In my generation, the perfect example is Python's Timsort. It is a modest improvement upon prior sorting algorithms, but it has come to dominate. And, frankly, in terms of computer science history, it was discovered very late. The paper was written in 1993, but the first major, high-impact open source implementation was not written until 2003. Ref: https://en.wikipedia.org/wiki/Timsort
It has been reimplemented in a wide variety of languages today. I look forward to the next iteration: WolfgangSort or FatimaSort or XiaomiSort or whatever.
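For readers who haven't looked inside it, here is a toy Python sketch of Timsort's central idea only: split the input into already-sorted "runs" and merge them, so nearly-sorted data is almost free. This is a simplification for illustration, not CPython's actual implementation (which adds minimum run lengths, binary insertion sort for short runs, galloping, and a carefully balanced merge stack).

    # Toy run-merging sort in the spirit of Timsort (illustrative only).
    def find_runs(a):
        runs, start = [], 0
        for i in range(1, len(a) + 1):
            if i == len(a) or a[i] < a[i - 1]:  # run ends where order breaks
                runs.append(a[start:i])
                start = i
        return runs

    def merge(left, right):
        out, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                out.append(left[i]); i += 1
            else:
                out.append(right[j]); j += 1
        return out + left[i:] + right[j:]

    def runsort(a):
        runs = find_runs(a)
        while len(runs) > 1:  # merge runs pairwise until one remains
            runs = [merge(runs[k], runs[k + 1]) if k + 1 < len(runs) else runs[k]
                    for k in range(0, len(runs), 2)]
        return runs[0] if runs else []

    print(runsort([1, 2, 3, 9, 4, 5, 0]))  # [0, 1, 2, 3, 4, 5, 9]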
I absolutely value & have huge respect for the deeply computational works that advance us all along!
But this is an exceedingly rare event. Most development is more glue work than advancing computational fundamentals. Very, very little of the industry is paid to work on honing data structures so generally.
"Good code" is very subjective. Even readability and modularity can be taken too far.
> concepts that are intrinsically complicated,
I'm not a mathematician, but I figure mathematicians aim for clean, composable abstractions the same way programmers do. Something complicated, not just complex in its interactions with other things, seems more useful as a bespoke tool (e.g. in a proof) than as a general purpose object?
> Whereas programming has more “small steps”: even someone who’s not smart (but has grit) can write an impressive program, if they write one component at a time and there aren’t too many components that rely on each other.
This is well put. I often wonder if a merely average working memory might be a benefit to (or at least may place a lower bound on the output quality of) a programmer tasked with writing maintainable code. You cannot possibly deliver working spaghetti if you can't recall what you wrote three minutes ago.
This is a baldly self-serving hypothesis.
Forth programmers make a similar point. Forth is stack based; you typically use stack operations rather than local variables. This is ok when your 'words' (analogous to functions/procedures) have short and simple definitions, but code can quickly become unreadable if they don't. In this way, the language strongly nudges the programmer toward developing composable words with simple definitions.
(Of course, Forth sees little use today, and it hasn't won over the masses with its approach, but the broader point stands.)
Strongly disagree.
There are plenty of cases where non-modular code is more readable and faster than modular code (littered, presumably, with invocations of modular logic).
There are also countless cases, particularly in low-level languages or languages with manual memory management, where the best solution -- or the correct solution -- is far from readable.
Readability is, in any case, in the eye of the beholder (my code is readable; isn't yours?) and takes a back seat to formal requirements.
Just as a matter of personal style, I prefer long, well-organized functions that are minimally modular. In my experience, context changes -- file boundaries, new function contexts, etc. -- are the single greatest source of complexity and bugs once one has shipped a piece of code and needs to maintain it or build on it. Modular code, by multiplying those, tends to obscure complexity and compound organizational problems by making them harder to reason about and fix. Longer functions certainly feel less readable initially, but I'd wager they produce better, clearer mental models, leading to better solutions.
I'd say that readability, which often boils down to consistency, and modularity are ways to do this, but they aren't the only ways. And as you say, sometimes there's a need for "unreadable" code, so not everything can be easy.
I've always found the car metaphor to work very well for understanding this: a car is a machine that can transport itself from point A to point B (some other rules apply). There are different types of cars, but you certainly haven't understood the definition if you say that something is not a car because it is not a Volvo, or because it doesn't look like a Ford, when it's clearly able to transport itself.
Math is the study of ideal objects and the way they behave or are related to each other. We have many branches of mathematics because people have invented so many objects and rules to play with them. Programming is nothing if not this very definition. The fact that you don't have to "use math" when programming is not really addressing the point; it's like saying a car is not a car because it has no discernible brand.
"Time is not a thing in math" is not understanding what math is. Time is another ideal object following certain rules under a given domain. Programming is coming up with objects of different size, with different characteristics, with interact at different points in time, i.e. following certain rules.
The study itself claims:
- fluid reasoning and working-memory capacity explained 34%
- language aptitude explained 17%
- resting-state EEG power in beta and low-gamma bands explained 10%
- numeracy explained 2%
They take math skills to equal numeracy. The study itself implies this too. I disagree on a fundamental level with that. Math skills align much more closely to fluid reasoning than to numeracy.
In small peer groups (“pods”) that debug and learn together, communication becomes a core skill—and that can actually change how math skills are applied and developed. Language doesn’t just support learning; it reshapes the process.
This has the effect of making programming easier, but don't confuse the two.
English majors.
It'd be interesting to see correlations (language brain vs math brain) for how easy or hard it is for people to solve new problems with language after they already know the basics.
Because, if you do what we do, it's obvious that language > math for most of this stuff.
I'd go so far as to say that music > math for programming.
Code written by programmers with humanities backgrounds is easily identifiable as being of bad quality.
Kind of like vibe coding meets programming by accident, before vibe coding was really a term.
It's a faux pas to even mention this IRL, but good coders know what's up.
Those "coders" usually get promoted to people managers, which is usually what they want anyway because their self-worth relies on abusing others to mitigate the correct self-perception they have of being "inferior".
The problem is, things need to be solved and vibe+accident programming can only go so far.
But fear not, they can always scapegoat whoever solves the problems, because if they were not to blame, how could they know what was up or even feel the need to correct it?
Even if many of the bad coders are those who came from the humanities and don't have coding experience because they just entered the field (though once you get it, you no longer really have "a humanities background").
As a new computer science professor at a community college, this is a timely article for me that may end up influencing how I teach, especially my introductory programming course as well as my discrete mathematics course.
That said, I wonder if early programming was much more math heavy, and higher-level languages have successively reduced that need over time.
Mathematics itself is a human-made formal language that can be bootstrapped from definitions and axioms of logic and set theory, which have to be given in human language first.
Experienced mathematicians read formal theorems written in Greek letters off their blackboards as if they were normal English, suggesting they think about them as if they were just normal English. This is not to say they cannot also picture in their mind's eye visual representations isomorphic with that language if they choose to.
It's an actively stupid fiction for people who don't understand what math is.
This comes off quite judgmental, and doesn't help me understand your actual point. Could you elaborate on the differences, as you see them?
In some sense, an undergraduate math education is akin to learning the “standard library” (in the software engineering sense) of higher mathematics. Most courses start with basic abstractions of some mathematical object and repeatedly construct more and more abstractions on top of those. The structure of those abstractions is similar to how you might build a library. A professional mathematician is expected to be fluent in the mathematical standard library, just like how you might expect an experienced software engineer to be fluent in Python’s standard library.
If this analogy is true, people who can learn Python relatively quickly might be able to also learn higher mathematics relatively quickly under the right pedagogical environment.
This was kind of how math classes worked, but without that explicit phrasing. It would certainly make the analogy between the two activities more obvious. I also wonder whether people would have less trouble with quantifiers if they were phrased in programming terms: a proof of "forall x, p(x)" is a function x=>p(x), and a proof of "there exists x such that p(x)" is a pair (x, p(x)). e.g.
Continuity: (epsilon: R, h: epsilon>0, x_0: R) => (delta: R, h2: (x: R, h3: d(x,x_0) < delta) => d(f(x),f(x_0)) < epsilon)
Uniform continuity: (epsilon: R, h: epsilon>0) => (delta: R, h2: (x: R, x_0: R, h3: d(x,x_0) < delta) => d(f(x),f(x_0)) < epsilon)
proof of UC => C = (epsilon, h, x_0) => let delta, h2 = UC(epsilon,h) in (delta, h2(_,x_0,_))
So when you're trying to figure out how to do the proof, it's clear what kind of type you need to return and your IDE could help you with autocomplete based on type inference.
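As a rough illustration of that correspondence, here is a self-contained Lean 4 sketch of the "uniform continuity implies continuity" step; `lt`, `zero`, `dX`, and `dY` are abstract stand-ins for <, 0, and the two metrics (nothing from Mathlib is assumed), so the proof term really is the little function sketched above.

    -- Minimal sketch: the "metric space" is just abstract parameters.
    theorem uc_implies_c {X Y R : Type}
        (lt : R → R → Prop) (zero : R)
        (dX : X → X → R) (dY : Y → Y → R) (f : X → Y)
        (uc : ∀ ε, lt zero ε → ∃ δ, lt zero δ ∧
                ∀ x x₀, lt (dX x x₀) δ → lt (dY (f x) (f x₀)) ε) :
        ∀ ε, lt zero ε → ∀ x₀, ∃ δ, lt zero δ ∧
            ∀ x, lt (dX x x₀) δ → lt (dY (f x) (f x₀)) ε :=
      fun ε hε x₀ =>
        -- uniform continuity hands us a δ that works everywhere;
        -- continuity at x₀ just reuses that δ, specialised to this x₀
        let ⟨δ, hδ, hball⟩ := uc ε hε
        ⟨δ, hδ, fun x hx => hball x x₀ hx⟩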
The comment below/above sort of explains it, except I'd argue covering it starts/can start/should start much earlier than at university. Basically, the vast majority of most branches of math, especially pure math, involves very little in the way of calculation. And the way math gets taught (and stupid claims about "math brain") means that many kids who aren't in the top few % of "doing calculation" never get to do the classes where it's less important.
LLMs, trained on words/tokens and symbols and logical combinations of them, are proving to be good at math, and bad at calculation/arithmetic. If an LLM went to school we'd never let it train on real math tokens. It would get shoved in the corner as a "model bad at math" because it was a "model bad at arithmetic".
Poetry is the art of giving different names to the same thing. Mathematics is the art of giving the same name to different things. (Henri Poincaré)
It's basically being good at "naming things", in particular abstractions.
Did Robert Frost say something like, "all language is poetry?" A chair is the term I learned to describe the object I am currently sitting on while typing this comment. Germans might call the same object a 'Stuhl', the French might call it a 'chaise', etc.. My point being that the object isn't technically any of those words, but rather those words symbolically map to that object.
Tell that to the people who designed the study, the people that approved the study, and the people that funded the study.
At the end of the day, the study set out certain criteria for assessing certain skill types/competencies and divided the people by those well defined criteria. I think it’s pretty hard to argue against the idea that people might have at least some level of aptitude for different types of activity and skill.
It leads to different styles of thinking and problem solving.
I think most people would agree that problem solving, expressing ideas verbally, and expressing ideas in math are very different skills.
A good place to look is after brain trauma, where people end up with weird missing functionality.
Oliver Sacks writes about some of the weird side effects of brain trauma.
We could experiment on mathematicians or writers by causing trauma to different parts of the brain...
Or neural stimulation/anaesthetic of brain regions might show something.
Which... is what math is, too.
First red flag is here. The title rewrote this to be language only. That problem solving skills are relevant is pretty obvious, but language less so.
I’ve been programming for most of my life and I don’t consider myself a very good speaker. My language skills are passable. And learning new languages? Forget it. So I’m skeptical. Let’s look at the study.
First of all, "math" becomes "numeracy". I think programming is probably closer to algebra, but even then it's less strict and easier to debug.
> Assessed using a Rasch-Based Numeracy Scale which was created by evaluating 18 numeracy questions across multiple measures and determining the 8 most predictive items.
Also, the whole thing is 5 years old now.
To me, problem solving ability is precisely the same as the ability to articulate the problem and a solution. I don't see a major difference.
If you can solve a problem but you can't articulate what the problem is or why the solution will address it, I wouldn't call you a good problem solver. If you can articulate the problem well but not come up with a solution, you're already doing better than a lot of programmers in the world, and I'd probably prefer working with you over someone who presents the solution without "showing their work".
In fact, what is problem solving without such articulation? It's hard to even grasp what the skill means in a raw sense. Arguably creativity in this context is just the ability to reframe a problem in a more approachable manner. Many times, if not most times, such framing implies some obvious solution or sets of solutions with clear tradeoffs.
If you’re debugging, you can get by for a long time by trying things until the compiler shuts up. It’s not efficient or good but people do it.
I have a very difficult time trying to extract the difference between "linguistic ability" and "critical thinking", though:
1. The core difference between "critical thinking" and "uncritical thinking" is the ability to discern incoherency from coherency.
2. Coherency is evaluated at the linguistic level: do the terms bind in meaningful ways to the problem? Do any statements contradict each other?
3. The remaining aspect is "creativity": can you come up with novel ways of approaching the problem? This is the hardest to tie to linguistic ability because it sort of exists outside our ability to operate within an agreed context.
So while I agree these are distinct skills, I still have difficulty identifying what remains in "critical thinking" after linguistic ability is addressed.
The recruiter labels an algorithm problem as a "coding" test, but it's a math test, and concludes that most applicants who claim to be fluent in a programming language can't code and must have lied on their resume.
For context, I don't mind algorithm tests, but I strongly disagree with recruiters presenting it as a coding assessment.
Even if CS is sort of applied mathematics.
Math isn't about calculations/computations, it is about patterns. You get to algebra and think "what are these letters doing in my math" but once you get further you think "what are these numbers doing in my math?"
A great tragedy we have in math education is that we focus so much on calculation. There's tons of useful subjects that are only taught once people get to an undergraduate math degree or grad school despite being understandable by children. The basics of things like group theory, combinatorics, graphs, set theory, category theory, etc. All of these also have herculean levels of depth, but there's plenty of things that formalize our way of thinking yet are easily understandable by children. If you want to see an example, I recommend Visual Group Theory[0]. Math is all about abstraction and for some reason we reserve that till "late in the game". But I can certainly say that getting this stuff accelerates learning and has a profound effect on the way I think. Though an important component of that is ensuring that you really take to heart the abstraction, not getting in your own way by thinking these tools only apply in very specific applications. A lot of people struggle with word problems, but even though they might involve silly situations like having a cousin named Throckmorton or him wanting to buy 500 watermelons, they really are part of that connection from math to reality.
This is why "advanced" math accelerated my learning: that "level" of math is about teaching you abstractions. Ways to think. These are tremendously helpful even if you do not end up writing down equations. Because math isn't really about writing down equations. But we do it because it sure helps, especially when shit gets complicated.
[0] https://www.youtube.com/watch?v=UwTQdOop-nU&list=PLwV-9DG53N...
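As one concrete, hedged example of the kind of abstraction a child can meet early (my own toy illustration, not taken from the linked lectures): the four symmetries of a rectangle form a group, and a few lines of Python can check closure and inverses by brute force.

    # The rectangle's symmetries: do nothing, flip horizontally, flip
    # vertically, rotate 180 degrees. Together they form the Klein four-group.
    from itertools import product

    identity   = lambda p: (p[0], p[1])
    flip_h     = lambda p: (-p[0], p[1])
    flip_v     = lambda p: (p[0], -p[1])
    rotate_180 = lambda p: (-p[0], -p[1])
    symmetries = [identity, flip_h, flip_v, rotate_180]

    corners = [(1, 2), (1, -2), (-1, 2), (-1, -2)]

    def same(f, g):
        # two symmetries are "equal" if they move every corner the same way
        return all(f(c) == g(c) for c in corners)

    def compose(f, g):
        return lambda p: f(g(p))

    # closure: composing any two symmetries gives one of the original four
    assert all(any(same(compose(f, g), h) for h in symmetries)
               for f, g in product(symmetries, repeat=2))
    # identity and inverses: here every symmetry undoes itself
    assert all(same(compose(f, f), identity) for f in symmetries)
    print("the rectangle's four symmetries form a group")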
I mean, is it any surprise kids get bored in math? They spend years learning the same thing. You spend years learning addition and multiplication (subtraction and division are the same operators). I sure as hell hated math as a kid. It especially doesn't help that it is always taught by someone who's also dispassionate about the subject. You really can get a process going where most middle schoolers are doing calculus and linear algebra (and advanced ones are doing this in elementary). It isn't as far-fetched as many would believe.
Indeed. Kids did not evolve to learn things because they would be useful years down the road. But they did evolve to inhale knowledge at very high rates wherever it extends their capabilities in the moment.
My take on this is that math should be tied to kids crafting, experimenting, and learned directly in the context of its fun and creative uses.
Geometry screams out to be developed within a context of design, crafting, art and physical puzzles. Algebra, trigonometry, calculus ... they all have direct (and fun) uses, especially at the introduction stage.
The availability of graphics software, sim worlds, 3D printing, etc. should make math, from the simplest to the most advanced, more fun and immediately applicable than ever.
Boredom with math is a failure to design education for human beings.
(Then there is the worst crime of all - moving kids through math in cohorts, pushing individuals both faster and slower than they are able to absorb it. Profound failure by design.)
> Indeed. Kids did not evolve to learn things because they would be useful years down the road. But they did evolve to inhale knowledge at very high rates wherever it extends their capabilities in the moment.
I strongly disagree with this. Children play. Most animals play. It is not hard to see how the skills learned through play lead to future utility despite potentially none at the time. We invent new games, new rules, and constantly imagine new worlds. The skills gained from these often have no utility beyond what is self-constructed, though many do have future rewards. I think we often miss those because that path runs through generalization. Learning how to throw a ball can help with learning physics, writing, driving, and much more. The skills don't perfectly transfer, but it'd be naive to conclude that there aren't overlaps.
I think the truth is that as long as we are unable to predict the future, we really can't predict what skills will be useful and what specific knowledge should be pursued. We can do this in a more abstract sense, as we can better predict the near future than the far, but that also means we should learn creativity and abstraction, as these allow us to adapt to changes. And that is why I believe we evolved these methods, and why you see things like play in most creatures.
I am not arguing against this. On the contrary, I agree with this completely.
Pairing the opportunity to learn and use highly general long term knowledge, toward progression in their own intrinsic interests (crafting, whatever) is the point I am making.
The alternative is a long line of math for math's sake, which neglects to harness and support each child's need and motivation to continually advance their own natural interests in all kinds of idiosyncratic directions.
But if you're willing to hunt, I know that this idea was attempted before[0]. France and USSR had better success than the US. I'm sure there are still people working in this direction. I don't have children, but fwiw I've taught my nieces and nephews algebra and even some of calculus before they were 10 just in visiting time during vacations. They were bored and it seemed more fun than talking about the drama, politics, and religion that the rest of my family likes to spend most of their time on. Kids were similarly disinterested in that stuff lol. I've also seen my god{son,daughter} be able to learn these types of skills, so I'm highly confident it is doable.
I was never told, for example, that matrices are a useful abstraction (shorthand?) for representing linear equations.
Or that imaginary numbers were an invented abstraction that made certain calculations easier.
It would have been nice to learn it from the perspective of how those abstractions came into being and why they're useful.
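As a small illustration of the matrix point (the particular numbers are arbitrary, and this assumes NumPy is available): the system 2x + y = 5, x - y = 1 is just the matrix equation A @ v = b, and that shorthand is what a matrix is at this level.

    # The linear system 2x + y = 5, x - y = 1 written as A @ v = b.
    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, -1.0]])
    b = np.array([5.0, 1.0])
    x, y = np.linalg.solve(A, b)  # recover the unknowns from the shorthand
    print(x, y)  # 2.0 1.0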
The patterns of repetitions of single things, repetitions over different things, repetitions between different things, repetitions over repetitions, ...
Natural/counting numbers are just the simplest patterns of repetition.
Congratulations on overcoming the (very weird) choice of whoever taught your intro to linear algebra class.
Regardless, it's a shockingly common occurrence. I'd agree it's not the right way, but it is also a common way. That perspective might explain those priors.
But yes, I am expressing honest admiration—it was a bad move on the part of the teacher I think, which the poster seems to have overcome!
I went through the same thing, it really isn't easy. But it is also why I don't blame others for not seeing it. It would be hypocritical to do so. I'm just not sure what's the best way to share, I'm open to ideas. Unfortunately we have to contend with priors that we see here, though I'm happy if my efforts even make one more person able to share in this beauty.
But nobody even told us why! And at that time I never thought to ask.
I'm not going to sit here and act like I was a star student or anything. I was more of a class clown type. I absolutely hated math and all things math. That was, until I went to college. A switch flipped when I was in a Calculus II class.
Our professor asked, "What is 100 divided by 0?" People in the class responded with, "You can't divide by 0 because it's undefined." To which our professor responded, "Why?" Then a student took out a calculator and showed the professor that the answer was indeed undefined. To which he responded, "OK, how do you know that calculator is correct? Do you just believe it because people told you that dividing by zero is undefined? OK, the answer is undefined... but why?"
Right then and there, a switch flipped in my brain. I realized that I was basically institutionalized to just not question math and to just accept that things work a certain way "just because." It was at that point I actually started to become interested in math, and it completely changed my outlook on math in a positive manner. I love math now (despite being horrible at it).
> I’ve always been a straight-A student.
> if something was not written in a book I could not invent it unless it was a rather useless variation of a known theory. More annoyingly, I found it very hard to challenge the status-quo, to question what I had learned.
I don't think this part is isolated to math, but there's a lot of acceptance for "because" being an answer. Being in an authoritative position and being challenged can be frustrating, but I think a lot of that frustration is self-generated. We stifle creativity when young, and it should be no surprise that when people do challenge, they frequently don't have the tools to do so (or to be receptive) effectively. Truthfully, "I don't know"[1] still shuts down the conversation.
But I think people themselves are uncomfortable with not knowing. I know I am! But that isn't a feeling of shame; it is a feeling that creates drive.
[0] https://thomwolf.io/blog/scientific-ai.html
[1] Alternatively, variations like: "That's a good question, I don't know" or "Great question, but we don't have the tools to address that yet". The last one need not undermine your authority either. Truthfully, the common responses resulted in me becoming strongly anti-authoritarian. Which, also has greatly benefited my career as a researcher. So, thanks lol
I guess I've been lucky with my math teachers.
I had the same experience as you did, in that studying abstract algebra by myself accelerated my learning and reasoning skills.
I don't think this experience is uncommon (among people who get to these levels). Which is why it is really sad. Especially given how linear algebra and abstract algebra are in a lot of ways easier than calculus. I also think they should be taught earlier purely due to the fact that they teach abstraction and reasoning.
Language also concerns form. Grammar has form. Concepts are forms.
Math is language. 'Everything' is language. Language is the image of reality.
In the beginning was the Logos...
> Math is language.
Facts. I think it is hard to disagree with this, and it seems like a rare opinion to be held by people at high levels.
> Language is the image of reality.
An image?[0] ;)
> Math major here
Forgive me if I doubt, but your comment would strongly suggest otherwise, along with this one[0]. The reason for doubt is a failure in fairly basic logic. Your claim is:
¬(A ↦ B) ⟹ ¬(B ↦ A)
I'd expect anyone willing to claim the title of "math major" to be aware of structures other than bijections and isomorphisms. The mapping operator doesn't form an abelian group.
This is not number theory, not set theory, just logic. I don't see how the properties of abelian groups apply. I do suspect that non-abelian groups are a case where "¬(A ↦ B) ⟹ ¬(B ↦ A)" is false, and that would contradict my counter-claim if "¬(A ↦ B) ⟹ ¬(B ↦ A)" were my counter-claim. No, my counter-claim is simply that because "B != A", therefore "A != B".
The statement "¬(A ↦ B) ⟹ ¬(B ↦ A)" is quite different, and I'm not 100% sure whether you are mixing set and logic notations? [0][1] It does seem you are bringing in number theory concepts, or we are misunderstanding one another?
I assume "↦" is the set mapping operator and "⟹" is logical implies, and "¬" is logical not. "¬(A ↦ B) ⟹ ¬(B ↦ A)" could potentially be phrased as: "If the element A cannot be mapped to B, then the element B cannot be mapped to A". I would agree that statement is not 'generally' true. Could you please clarify so we are not talking past each other.
[0] Per google: The symbol "↦" is a mapping symbol, typically used to indicate a function or relationship where one element maps to another. It's not a standard logical operator in the same way that symbols like &&, ||, or ¬ are. Instead, it represents a directional relationship between sets or elements, often seen in set theory and mathematical notations
Try that in a philosophy class and you can expect an F.
A math class too.
The simple refutation was: because B != A, therefore A != B
That WAS the math exam!! No group or set theory needed. It's simple and this is not interesting.
That's only in your head. Inventing claims so that you can pretend other people are wrong isn't a good move.
> That WAS the math exam!! No group or set theory needed. It's simple and this is not interesting.
I'm surprised you can pass a math class.
I'm not the only one that interprets a statement as "Math is Language" to be of the form "A = B" where "math" is A, "language" is B, and "is" is the equals operator, see: https://news.ycombinator.com/item?id=43874322
Seems kinda simple. You're engaging in almost pure personal attacks. Care to address the substance of how the three-word statement is not of the form "A = B", but is of some other form? In which case, perhaps you can guide the conversation as to why it is or isn't an interesting statement? If you want to change all the givens and use your own definitions, please provide those. Without common ground, this remains uninteresting.
This appears to be a link that contains zero support for your sentence. Neither the comment you linked nor the response below it features such an interpretation.
> Seems kinda simple.. You're engaging in a almost pure personal attacks. Care to address the substance of how the 3 word long statement is not of the form "A = B", but is of some other different form?
You're in luck! I've already provided that material, and you responded to it. For further discussion, you'd need to have a better working understanding of English.
> This appears to be a link that contains zero support for your sentence. Neither the comment you linked nor the response below it features such an interpretation.
It's interesting, because there is a disagreement about whether there is a paradox or not. There is a paradox if you take my perspective (that an equals relationship is being expressed), but none if an implication relationship is assumed.
These two sentences:
- "By that same logic you could also say that language is math"
- "Not quite, but the inverse is true."
The "not quite" says that "language is math" is not true. So we have "A = B", but "B != A", which is a paradox. OTOH it's not a paradox if what is actually being said is "A => B" but "B !=> A"
> You're in luck! I've already provided that material, and you responded to it.
Hello bad faith! I hope you are doing well today. Let's end the conversation here.
> No, my counter claim is simply that because "B != A" therefore "A != B"
We all know that this is not always true. It may be true in certain cases, in certain fields (pun intended), but you know that this is such a basic logical fallacy that it is taught to children. Here's a counterexample so we can lay this to rest.
Let "A" be "a square"
Let "B" be "a rectangle"
B != A -> "A rectangle is not a square" (True)
A != B -> "A square is not a rectangle" (False)
Stop cosplaying, stop trolling, stop acting in bad faith. Did you even look at my name? There are 2 people that should come to mind. Certainly any logician would notice.
Or, "Properties of Equality... The [equality] relation must also be symmetric. If two terms refer to the same thing, it does not matter which one we write first in an equation. ∀x.∀y.(x=y ⇒ y=x)" [2]
The 'is' relationship in logic is understood to be equality. The 'is a' relationship in logic is subset. Colloquially, the word "is" can be either one though.
I notice in your counter example you swapped the "is" relationship with "is a". Keeping the "is" relationship: "A rectangle is not square" (true generally, but false for specific cases for rectangles). That is really the distinction, the sometimes true vs always true.
Let's stick to the precise definition of "is" to mean a logical equality from here on please, and be precise when we use "is" vs "is a" and never infer "is" to actually mean "is a".
So, with "Math is a language" vs "Math is language". Which do you mean?
[1] https://en.wikipedia.org/wiki/Equality_(mathematics)
[2] http://logic.stanford.edu/intrologic/extras/equality.html
You say that like you think it's a qualification?
You're not making it look good. A common use of "be" is to express set membership rather than identity. For example, "two is an even number" or "tigers are cats".
We may hope that one day you'll come to realize that set membership is not reflexive, and - more to the point - also not symmetric.
If we do want to talk sets, that seems far more interesting. The statements like "Math is a language", or "Math has equivalence classes within languages", or "Mathematics are a Language" are slightly more interesting to consider IMO.
>> Math major here.
> You say that like you think it's a qualification?
Agree, appeal to authority fallacy. I take that over mis-framing any day though.
Any old-timers might appreciate that we're arguing over the meaning of the word "is" =D
Google for "logical "is a" vs logical "is"".
Google AI answers this:
> "is" typically represents an equality relation
Rest is from the AI response:
In logic, "is" typically represents an equality relation, while "is a" (or "is of the type") represents an inclusion relation. "Is" indicates that two things are the same or identical, while "is a" indicates that one thing is a member of a larger class or set of things.
Logical "is" (equality):
Meaning:
"A is B" means that A and B are the same thing, or have the same properties.
Example:
"The Eiffel Tower is in Paris" (the Eiffel Tower and the thing in Paris are the same thing).
Logical "is a" (inclusion or type): Meaning: "A is a B" means that A belongs to the category or class of things that are B.
Example: "A dog is an animal" (dogs are a type of animal).
What does "silk is cloth" mean?
I'll note, I have not asked any questions other than (paraphrasing): "what do you mean precisely?" To which, I have not gotten any answers other than trolling and flaming; and examples that all conveniently swap "is" with "is a".
The conclusion of the study was that linguistic aptitude seemed to be more correlated with programming aptitude than mathematical aptitude, which seems fairly interesting, and also fairly unconcerned with which specific physical regions in the brain might happen to be involved.
> The conclusion of the study was that linguistic aptitude seemed to be more correlated with programming aptitude than mathematical aptitude
And this is what I'm pushing back against and where I think you've misinterpreted.
> They found that how well students learned Python was mostly explained by general cognitive abilities (problem solving and working memory), while how quickly they learned was explained by both general cognitive skills and language aptitude.
I made the claim that these are in fact math skills, but most people confuse math with arithmetic. Math is a language. It is a language we created to help with abstraction. Code is math. There's no question about this. Go look into lambda calculus and the Church-Turing Thesis. There is much more in this direction too. And of course, we should have a clear connection to connect it all if you're able to see some abstraction.
> Language is not math, therefore math is not language.
There is no problem with A -> B ∧ B -/-> A
Here's an example. "I live in San Francisco" would imply "I live in the US". But "I live in the US" does not mean "I live in San Francisco".
Here's a more formal representation of this: https://en.wikipedia.org/wiki/Bijection,_injection_and_surje...
The statement "Math is Language", where A is Math and B is Language, maps to the logical assertion: "A = B".
If we are going to be really twisty and non-standard, we could interpret the English "is" to be "is an equivalence class of". Which would map to your example pretty well: language is indeed an equivalence class of math, but math is not an equivalence class of language. Though nobody is talking about the implies operator or equivalence classes here.. It's an "is" relationship, logical *equals*.
It very obviously doesn't. A square is a rectangle. seadan83 is (probably) a mammal. Math is a language.
Find examples with two singular nouns and just the word 'is'.
The phrase in question: 'Math is language' is an example, or something like 'food is love' is too. I concede you could interpret those last few sentences with poetic license to be read more like: "A is a form of B", or "A is a B" - though that is not what was written and this is not a place to expect that much poetic license.
*edit*: a minute later, thought of a good example. "ice is water". True that "ice is a form of water", but strictly speaking no, "ice is not water". I'll concede there could exist an implied "is a", or an implied "is a form of", but that is poetic license IMO.
[0] Google AI summarized it pretty well: google "logical "is a" vs logical "is"
> In logic, "is" typically represents an equality relation, while "is a" (or "is of the type") represents an inclusion relation. "Is" indicates that two things are the same or identical, while "is a" indicates that one thing is a member of a larger class or set of things
Well, what you reacted to was, let me copy'n'paste, "Math is a language". It was you who insisted that "is" in this sentence maps to "equals" relation, so thanks for agreeing that you were wrong.
The problem here all comes down to seadan83 acting in bad faith and using an intentional misinterpretation of my words in order to fit them to their conclusion. I'm not going to entertain them more because I won't play such a pointless game. The ambiguity of written and spoken language always allows for such abuse. So either they are a bad faith actor "having fun" (trolling) finding intentional misinterpretations to frustrate those who wish to act in good faith or they are dumb. Personally, I don't think they're dumb.
But then again, isn't a good portion of this thread non-mathematicians arguing about what math is? I really thought ndriscoll put it succinctly[0]
> It's like trying to argue about the distinction between U(1), the complex numbers with magnitude 1, and the unit circle, and getting upset when the mathematicians say "those are 3 names for the same thing".
I fear the day some of these people learn about Topology.
No, a good chunk is clarification of "WTF do you mean?"
The abstract arguing I suspect we all find to not be interesting and absurd. Let's go to substance here..
The article has stated there is evidence that the math related regions of the brain are not nearly as heavily used when coding as compared to the language regions. The "mathematicians" seem to be arguing that this can't be true because coding and math are so closely related.
This is why the article and evidence are interesting. Coding and math are clearly and very closely related in many ways. Yet, the way the brain handles and interprets coding is more akin to pure language, than it is to pure math.
Which I suppose makes it all the more interesting that Math, Language, and coding are so related, yet (per the evidence and the article) - the brain does not see it that way.
Agree.
> He's right, "is" maps to "equivalent" but he's also wrong because "is" also maps to "subset" and several other things. "Is" is a surjection.
I agree. So, why can't either interpretation be valid? Perhaps, because one is obviously not true? Yet, it seemed like there was a clarification that the obviously not true relationship was the intended one!!!
Godelski previously wrote: "Coding IS math. Not "coding uses math".
I interpreted that clarification to mean you intended "is" to be a strict "is". Particularly given the other context and discussion of "is a" in other threads. I suspect now you were perhaps emphasizing "uses a" vs "is a", rather than "uses a" vs "is". Not a satisfying conclusion here. It would be a lot more interesting if the precision could have been there and had we been able to instead talk about whether all coding languages form an abstract algebra or not. Or perhaps use that line of reasoning to explain why all coding is a form of math. That would have been far more interesting..
I'm sorry the conversation got so caught up in pedantry.
Previously I would have quite readily agreed that at least "coding is a subset of math" - now I'd only agree in the sense that coding is an applied math, just like Physics is applied Math.
So it does seem to be clearly a 'uses' relationship, and I'll support the assertion. To explain: coding is the act of creating a series of boolean expressions (governed by boolean algebra) to create a desired output from a given input. To really explain: code is translated to assembly, which is then translated to binary, which then directly maps to how electrical signals flow out of CPU registers into a series of logical circuits. Assuming no faulty circuits, that flow is completely governed by boolean algebra. We therefore use boolean algebra to create our programs; we define a series of boolean operations to achieve a certain goal. We are _using_ boolean algebra to arrange a series of operations that maps a given set of inputs to a desired output. In the colloquial sense, coding is applied math; it is not pure math. We use boolean algebra to create our programs, but the programs are not boolean algebra themselves, only an application of it.
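To make the "applied boolean algebra" framing concrete, here is a toy one-bit full adder in Python written only with boolean operators; chaining it gives binary addition, which is roughly the shape of what the circuits underneath are doing (a sketch for illustration, not a model of any particular CPU).

    # One-bit full adder built purely from boolean operations.
    def full_adder(a, b, carry_in):
        sum_bit = a ^ b ^ carry_in
        carry_out = (a & b) | (carry_in & (a ^ b))
        return sum_bit, carry_out

    def add_bits(x_bits, y_bits):
        # add two equal-length little-endian lists of 0/1 bits
        out, carry = [], 0
        for a, b in zip(x_bits, y_bits):
            s, carry = full_adder(a, b, carry)
            out.append(s)
        out.append(carry)
        return out

    print(add_bits([1, 1, 0], [1, 0, 1]))  # 3 + 5 = 8 -> [0, 0, 0, 1]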
Now, tying it all back to the article and implications. The data collected stated that the language parts of the brain are more responsible for whether we are able to learn programming. That seems to imply that the math part of programming is so far abstracted, that the parts of the brain which are used for math are no longer the most salient.
I wonder how the experiments and results in the article would have gone had the topic been electrical circuits and electrical engineering, which is far closer to the underlying math than coding.
There are other discussions which say:
- Math is a subset of language, surely
- It's easily argued that languages are subsets of math.
Given that context, the distinction seems to be very important.
I find the following idea (paraphrasing) to be very interesting: "not only is math a subset of language, but the language and math are equal sets." I also think it's not true, but am curious how a person would support this assertion. So, my challenge is, because the logical "is" relationship is reflexive and the reflexive property does not hold here - how can this be true? The most satisfying answer has been (paraphrasing) "cause I'm using non-precise language and you should just infer what I meant." Which is fine I guess..
From my experience, my ability to articulate myself well is bound up with my ability to abstract and detect patterns. It is the same thing I apply to crafting software, the same thing I apply to creating visual art.
I think high-cognitive-ability people segregating themselves into artsy vs mathy people has more to do with their experiences in their formative years.
At the same time, the study excluded "five participants ... due to attrition (not completing the training sessions), and one participant ... because he was an extreme outlier in learning rate (>3 sd away from the mean)." I mean, if you are to exclude 15% of your subjects without looking at their aptitude (maybe they didn't do it because it was too hard to pass the training tests to move to the next lesson, yet their language aptitude is high?), with only 36 subjects of which 21 are female (it's obvious programming is male dominated, so they only had 15 males: maybe it doesn't matter, but maybe it does), how can you claim any statistical significance with such small numbers?
> the original study uses "numeracy"
That's fair, although I don't think it changes my response. And the article still really leads to the wrong conclusions. You want to teach children abstraction and reasoning? You teach them math. Not numeracy, math.
Ultimately, I believe basic algebra and geometry are the most important takeaways from math classes for most people.
[1]: https://www.hs.fi/tiede/art-2000004823594.html (sorry, it's in Finnish and behind a paywall)
As it was explained to me, one wouldn't take a "Calculus I" class as a prerequisite for say an entry-level engineering course. One typically had such a strong foundation of algebra, that when encountering a problem that required calculus, the student would just learn the necessary calculus at that point in time. In other words, with such a strong algebraic background, other aspects of math, within reason, were much easier to grok.
> the results weren't that good; it proved to be too abstract for young kids
You cannot draw that conclusion from the evidence. Yes, the evidence might support that conclusion, but many others could too. For example, they could have just been really bad at teaching it. That even seems likely, as it is difficult to perform such a reformulation and to do so broadly and quickly.
The other reason I'm willing to accept alternative conclusions is that France and the USSR had far more success than Finland (or even America). Their success contradicts a claim that "[it is] too abstract for young kids". You'd need to constrain it to something like "[it is] too abstract for Finnish kids", which I think both of us would doubt.
However, I'm curious what you base the claim "France and the USSR had far more success" on?
It seems plainly obvious that this language just means “areas of brain that activate when dealing with math problems” vs “areas of brain that activate when dealing with language problems” and yes there is hard evidence that there is a difference between them.
The fact that we see separate patterns of brain activity is the interesting part not any semantic argument about what you or mathematicians feel should be defined as being math or not.
I feel weird answering even if I infer the right question, because it feels tautological. If you have Wernicke's aphasia, does it not create the possibility that I already have answered, but your condition resulted in a misunderstanding? Given the condition, does it not create a high probability that a response will similarly be misunderstood? Is not anosognosia quite common?
Maybe I'm really misunderstanding but honestly I'm not sure what you're asking
We had basically 4 tracks: one ended with you doing algebra 1 in senior year, another ended with you doing trig in senior year, yet another ended with trig (no precalculus), and then one ended with you doing trig and precalc. That final track then had further subdivisions that were too small to have their own full classes: some kids just did precalc, some did calc 1 and took the AP Calculus AB exam and/or IB Math SL, while an even smaller group took AP Calculus BC and/or IB Math HL. The total number of kids who took the AP Calc AB exam in my year was 20ish, out of a graduating class of 500-600.
> the copious amount of tiering of math classes seemed to indicate to me that either we're really bad at teaching math to 80% of people, or only 20% of people will be able to handle precalculus
Or maybe it indicates that the people designing this system should be fired? Job security through complexity?
(Or maybe I’m just biased by the system I know… I’m just asking questions)
> … I’m just asking questions
I'm not disagreeing, but just wanted to point out that this phrasing is commonly used by bad-faith actors. I'm not saying you're using it this way, and I legitimately do not think you are. But I wanted to point it out because I think your comment could be interpreted another way, and this phrase might be evidence for someone to reach that conclusion.
The classic example is conspiracy theorists. But lots of bad-faith actors also use it to create leading questions that generally ignore or lead away from important context. Again, I do not think you're doing this. The rest of your comment makes me think you're acting in good faith.
Math classes weren't separated by grade. So, I took Algebra 2 in 8th grade alongside a cross section of the school; there was one other 8th grader, two seniors, and a selection of people in between.
There was no path to take the AP Calculus AB. Trig / Calc A was offered as two semesters, and then Calc B / Calc C was offered as two more semesters, after which you'd take the BC test. There was also no such thing as "precalculus". Trig followed Algebra II.
In my Calc C class, there were probably 8ish people, of which one or two (besides me) would have been in my grade.
Did you even read beyond the silly headline?
The article itself is about pre-testing subjects on a range of capabilities from problem solving ability to second (foreign) language learning ability, and then seeing how these correlated to the ability of the test subjects to learn to code.
The results were basically exactly what might be expected - people who learned Python the quickest were those who scored the best at learning a second language, and those who learned to wield it the best were those who had scored the best at problem solving.
Not surprisingly math ability wasn't much of a predictor since programming has little to nothing to do with math.
> Did you even read beyond the silly headline?
Yes. I'll also refer you to the HN guideline on this matter. You're welcome to disagree with me, but you must communicate in good faith, and unless you have a very specific reason for thinking I didn't "RTFM", don't make the accusation.
I'm happy to continue discussing, but only on those terms. In fact, I think we're in far more agreement than your tone suggests. But I think you missed the crux of my point: math isn't number crunching.
Your response opened with addressing the headline, which anyone who had "RTFM" (RTFA) would have realized was unrelated to the body of the article. You then veered off into a tangent about the nature of math which again was not addressing the content of the article.
The underlying nature article, linked from the posted story, makes it even more clear what is being discussed, with the abstract stating:
> This experiment employed an individual differences approach to test the hypothesis that learning modern programming languages resembles second “natural” language learning in adulthood.
“Music class is where we take out our staff paper, our teacher puts some notes on the board, and we copy them or transpose them into a different key. We have to make sure to get the clefs and key signatures right, and our teacher is very picky about making sure we fill in our quarter-notes completely. One time we had a chromatic scale problem and I did it right, but the teacher gave me no credit because I had the stems pointing the wrong way.”
It is a great read!
It's surprisingly common. Case in point: "The unreasonable effectiveness of mathematics in the natural sciences".
A normal person wouldn't be surprised that describing how something works is a good way to understand it.
One could argue that all we do is turn thoughts, senses, and memories into further thoughts and appropriate actions, which is applying a pattern, which is math. But at that point the definition is too broad to be helpful for anything but playing word games.
Yes, math is very broad, but it doesn't cover everything. At least not at this time.
Oh wait, neuroscientists; that explains it all. A statistician's favourite target for being unable to interpret data correctly.
I don’t know about for learning but definitely for collaborating and mentoring. And it’s difficult to make a definition of mastery that excludes both of those, so I suppose after a fashion it’s right.
Despite being a professed lover of math, I scored higher on the verbal than the math SAT. There’s a lot of persuasive and descriptive writing in software, particularly if you’re trying to build a team that works smarter instead of finding more corners to cut.
Math is VASTLY different, with VASTLY more concepts that are far more abstract in nature, and it's much harder to grasp the infinite number of ways one mathematical construct can be applied to another. A person can "master" coding, but no one ever masters math.
So comparing math to language or to coding is silly. They're completely separate domains. Just because each of the three can encode and represent the other two doesn't make them similar in any way whatsoever.
This is beyond silly from my perspective. I know the field of CS is vast, but this seems to conflate programming with CS. My school was more theory-heavy, but there definitely came a point in certain paths of study where I didn't touch a line of code for a year, just pure math. I struggle to even understand how someone can think of this sentence - computer science at its core is underpinned by mathematics.
"Coding largely involves the 'logical part' of your brain. It tends to not include the 'language part' of your brain.
This is one reason why comments you add to code are so useful: they force you to engage both parts of your brain and therefore get better results.
This is also why when you go to explain a problem to a colleague, you often have a flash of brilliance and solve the problem: you are reframing the problem using a different part of your brain and the different part of your brain helps solve the problem."
I'm sure some readers here will say this is preposterous and there is no such thing as having "two parts of your brain".
To them I suggest watching:
1. "You are are two" (about people with their corpus callosum being severed) https://www.youtube.com/watch?v=wfYbgdo8e-8
2. "Conscious Ants and Human Hives" by Peter Watts https://www.youtube.com/watch?v=v4uwaw_5Q3I
It wasn't until high school, when I tested into the highest math class that the school offered, that I began to unlock (with some initial struggle) the more logical and procedural reasoning specific to mathematics, something I had always done well in but never explicitly gone above and beyond in, despite hints of such in the arithmetic competitions my school would hold and that sort of thing. I just think my brain works well for both the linguistic aspects of programming (more naturally) and the computational problem-solving aspects of programming. Certainly there are individuals who have strengths in both cognitive aspects, despite being more naturally attuned to one versus the other, at least presumably.
Perhaps this shows a cognitive profile that has natural strengths in both "brains", or maybe this highlights limitations of the article's potentially narrow definitions of "language" and "math", implying a more complex intellectual landscape.
Interesting findings nonetheless.
You absolutely should. Tiny sample size and poor statistical method. It's p-hacking, plain and simple.
People who end up being the best programmers have a deeper appreciation for semantics and information flow, but tend to commit more type II errors early on, making them inferior intro CS students.
Much of the CS curriculum (and typically also the required maths curriculum) in universities still favors the first type of student over the second, driving out the most capable and creative minds.
If you try programming and you don't like it chances are you won't be very good at it.
* It's a small sample, and they did not analyze the people who didn't complete the course. That's dubious. Those 6 could have had a massive influence on the outcome.
* The summary does not present the actual numbers. These are: "fluid reasoning and working-memory capacity explained 34% of the variance, followed by language aptitude (17%), resting-state EEG power in beta and low-gamma bands (10%), and numeracy (2%)". Note: numeracy, not math.
* The test result was only partially programming related. 50% consisted of the results of a multiple choice test with questions such as What does the “str()” method do?. Linguistic knowledge indeed.
* It's about completing a 7.5 hour Python course. That's learning indeed, but only the very beginning, where abstraction is not in play. The early phase is about welding bits of syntax into working order.
* The numeracy skills required are very low for such tasks, as the tasks are simple and mainly require thinking in steps and loops, whereas numeracy aptitude is generally measured with problems involving fractions and the like.
Edit: the paper uses the Rasch-Based Numeracy Scale for this, which seems to involve estimation and probabilities.
* 17% explained variance is a rather minimal result, and you cannot easily compare factors in such a small design, even if the other one is only 2%. That's a rather hairy statistical undertaking.
* Linguistic aptitude might explain the speed with which the course was followed, since the instruction is, obviously, linguistic. Hence, this factor is not necessarily related to the actual learning or programming.
* The argument from beta waves is clutching at straws.
* The argument that "perhaps women should have more of a reputation for being “good” at programming" because they score better on tests, is --however well meant-- utterly ridiculous. It reverses correlation to causation and then turns that into a fact.
* That linguistic skills are useful for programmers is widely understood. However, this is not because of the actual coding, but because the coder needs to understand the environment, the specs, the users, etc., all of which is transferred via language.
* And of course, the statistical result relies on Null Hypothesis Significance Testing, which is rotten in its very foundations.
* Note that the CodeAcademy course "Learn Python 3" is 23 hours in 14 lessons.
Also, the article doesn't mention "math skills". It talks about numeracy, which is defined in a cited paper as "the ability to understand, manipulate, and use numerical information, including probabilities". This is only a very small part of mathematics. I would even argue that mathematics involves a lot of problem solving, and since problem solving is a good predictor, math skills are a good predictor too.
Seeing as Codecademy lessons are written in English, I would think this may just be a result of participants with higher Language Aptitude being faster readers.
I do think that language skills are undervalued for programming, if only for their impact on your ability to read and write documentation or specifications, but I'm not sure this study is demonstrating that link in a meaningful way.
Well yes, my high school maths scores were in the high 90s, higher than my language scores in French, German and Latin, with some extracurricular Russian. I guess being a polymath helps.
Unless you are doing an engineering or mathematical application you don't need much math, especially as you can just call a function the vast majority of the time.
I did a number of software products and operating system modifications without using any math beyond arithmetic operations.
I was a resource for other programmers including the odd math PhD.
I learned to program when I was a kid and my maths skills were super basic. Programming can almost be distilled to something as basic as "if this, then do that", plus "do this x times". Then read API documentation and call functions that do what the docs say.
With just this basic understanding you can create a lot of stuff. The maths is obviously the foundation of computation, but to be a programming language user and build stuff, you don't actually need to understand most of it.
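A minimal sketch of that view, with made-up data: one loop, one conditional, and one standard-library call already make a small working program.

    from collections import Counter

    # Made-up chore list; the point is just the three moving parts.
    chores = ["dishes", "laundry", "dishes", "vacuum", "dishes"]

    counts = Counter(chores)             # call a function that does what the docs say
    for chore, times in counts.items():  # "do this x times"
        if times > 1:                    # "if this, then do that"
            print(f"{chore} keeps coming up ({times} times)")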
In university I did eventually do some math-y stuff (econ degree so prerequisites in stats, maths and even CS) and it helps with certain stuff (understanding graphics programming, ML and LLMs, plus knowing maths on its own is useful), but I still don't feel it was strictly necessary. Language and basic logic is enough IMO.
I feel the same way about starting learning programming. Repetition, repetition, repetition, until you "get good".
> All participants were right-handed native English speakers with no exposure to a second natural language before the age of 6 years
Which removes a confounder that Python mimics English syntax.
Still if this is a typical study recruiting thirty-some undergrads as subjects it's probably not generalizable, or even replicable given the same experimental setup.
I often tell people it's not that learning a language is hard, it's learning that language's software library... and learning a software library doesn't feel like learning a language. More like learning a set of tools.
Programming is the manifestation of thought through the medium of a keyboard and screen. If you are a clear thinker, if you can hold multiple things in your head at once, if you can reason about things and their relations, well, you can be a strong programmer.
It seems wholly unremarkable to me that someone new to Python would not be fazed by it, given its fundamental basis in words (print, if, etc.). Someone with a background in languages, who can think well enough to explicitly or implicitly pick up the structure of the language, is gonna do just fine. "Oh, so when I see a while, I need to end with a colon" isn't so different from "when I shout, I need to add a ! at the end"
(Java gets a special place in hell for forcing "public static void main" on every beginner.)
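As a rough illustration (a made-up countdown, nothing more): the "end a while with a colon" rule is about all the ceremony Python asks for before a beginner sees output, compared with Java's main-method preamble.

    count = 3
    while count > 0:   # like ending a shouted sentence with "!"
        print(count)
        count -= 1
    print("liftoff")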
Math only really comes into it when you want to reason about things that have a history of being manipulated mathematically, typically for reasons of convenience. You could probably invert a matrix in SNOBOL, but it's a lot easier to pull out lists and arrays and linear algebra.
In other words, let's see the follow-up paper to this where Python newbies are asked to model the trajectory of a bead on a wire and see how they do.
Does answering a quiz on the contents of the first lesson on how to program in Python really encapsulate anything concrete about who will and will not be able to actually program in Python?
I've always been disturbed by the disconnect between "first lessons" on programming languages and how I personally actually learn programming languages. I can't help thinking that the researchers have measured something else other than whether people have learned to program.
But as a matter of practice, teaching programming to engineers/scientists, even to mathematicians, is an order of magnitude easier than teaching math to CS folks. Simply quiz job candidates on floating-point arithmetic and see how many fail miserably.
Lately I’ve also felt language skills matter when writing concise, specific AI prompts. This has become a useful part of my programming workflow in, I suppose, the last year or so. Before that it was knowing “how to Google” well, but that’s less language-dependent in my opinion.
Probably the most valuable math classes for me were the ones that had me use algebra to solve word problems.
And, fundamentally, all languages describe the same stuff, using different tokens. That is pretty much in line with programming languages.
I believe the goal is to encourage those (young people) allergic to math but good in languages to realize they could be good at programming. That's worthy and important (though ironic to use a scientific study to do so).
As for the larger question commenters are raising, I notice often programmers reducing programming to problem-solving, and that to modeling, thence to math writ large; then they prove their point by noting that the most significant algorithms have strong and deep roots in math. The argument is: if the pinnacle of programming is math, then the more math-like programming is (in people and style), the better. (Hence: functional programming and provable algorithms are the epitome)
This argument is invariably made only by math experts, and does have the effect of reducing competition and selecting for like-minded persons, so there's at least the possibility of infection with self-interest and bias to the familiar.
I think the real distinction lies in this: with math, once you have a model, you fit things into that, and exclude other things. You do perfectly, but only in your domain. With language, any differance (any difference that makes a difference) starts to be named and tracked and dealt with somehow. It may grow in confusing ways, but it turns out (like architecture) it actually makes much more sense to adapt the structure to the use and relevance than vice-versa.
Sure, some subset of architecture is engineering by necessity, and it's probably even the hardest subset. But that's also the most portable knowledge and practice, and I would argue, thus easier to find and deploy once you've isolated such a need, particularly since mathy stuff fits nicely in a reusable library.
So math is an important but relatively isolated part of programming, and thinking it's ideal for everything is missing the point that programming matters for some purpose.
"attention is all you need" is but one example of the priority of relevance over structure in systems.
Computer science is much more than programming - and I think that most of the value derived is from being able to think about problems, which largely require the abstract type of thinking encouraged by more advanced math. Code is just a tool.
This classic article explains the real issue - like Mike Gancarz' classic on the Unix Philosophy, this is something all younger hackers should read, but few have, since these are the fundamental ideas that have created our modern world of computers: https://web.archive.org/web/20000529125023/http://www.wenet....
For example: "Matter determines consciousness." If we apply first-principles thinking—where does matter come from? This statement then becomes: "XXX created matter, and matter determines consciousness." At this point, our interest shifts to the first-principle subject "XXX," focusing our attention on it. Who is XXX? God?
In this thought process, we use the subject-predicate-object grammatical structure to trace back the original subject "matter" in "matter determines consciousness": where does matter come from? Although this formal reasoning does not involve specific mathematical formulas, it does employ formal logic to uncover a flaw, and it opens the door to a deeper rabbit hole.
If you ever need to get into the guts of a system or need to solve bleeding-edge problems for which good abstractions don't yet exist, the "math brain" becomes significantly more relevant.
I say this as someone who studied literature and philosophy. The majority of what I know about programming and software engineering I either taught myself or learned from the tutelage of others on the job. Early on in my career, a solid mathematics background was, indeed, not that relevant. These days, though, I'd be lost without it. Whether you like it or not, when it comes to doing real engineering you necessarily need to establish bounds and prove things about those bounds and typically you'll need to do this numerically or at the very least using inductive structures. Linguistic aptitude is still relevant, but it helps less in these cases.
An interesting book that illustrates the evolving societal perception of mathematicians (and by extension, computer scientists) over time is "Duel at Dawn". The modern (i.e., recent) notion is the reclusive genius (maybe even with a suggestion of autism) who possesses the alien, superhuman, unattainable, too-cool-for-you ability to process numbers and information that you can only be born with. (Those familiar with the TV show "The Big Bang Theory" will recognize the trope of the "Sheldon Cooper" character.) This is false.
The reality is that no one is born with the super-human ability to do anything - anyone who is very good at something has worked very hard to get good at that thing regardless of the perception.
edit: my initial criticism of “the study” was based upon the article. On a skim of the actual cited paper, I revised my specific criticism, but the actual paper still comes off as no more than a mild eugenics argument dressed in psychology and statistics jargon.
To begin with, the study measures functional numeracy: the ability to solve everyday numerical problems. This is quite different from the kind of advanced mathematics often associated with programming, such as formal logic, symbolic abstraction, or the use of formal languages (as found in denotational semantics or type theory).
These more abstract skills—not basic arithmetic—are essential for understanding recursion, type inference, or algorithm design. That functional numeracy has low predictive power in this study does not imply that deep mathematical reasoning is irrelevant to programming.
Moreover, the language used in the study is Python, which was explicitly designed to be readable and semantically close to natural language. This may give an advantage to individuals with strong verbal skills, but the results don’t necessarily generalize to languages like C, Lisp, or Haskell, where symbolic and logical density is much higher.
Finally, language and mathematics are not opposing domains. They share cognitive underpinnings, such as working memory, executive attention, and hierarchical structure processing. The key is not which one "wins," but how they interact and complement each other in different programming contexts.
No they're not. Academia has spent decades trying to formalize many aspects of programming and continues to be confused by the lack of correlation between comp sci grads and innovative programmers. Why is it that the drop-outs are succeeding so wildly?
Recursion, for example, is learned by most of us real world achievers when we hit a brick wall in programming that other methods won't solve, and we have that aha moment of "this is why this exists". Not because we studied advanced math with symbolic abstraction, denotational semantics and type theory.
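A hypothetical example of that kind of brick wall: printing a nested reply thread of arbitrary depth. A plain loop gets awkward fast, and the recursive version is the "aha".

    def print_thread(comment, depth=0):
        print("  " * depth + comment["text"])
        for reply in comment.get("replies", []):
            print_thread(reply, depth + 1)  # recurse into each reply

    thread = {
        "text": "math isn't needed for programming",
        "replies": [
            {"text": "depends what you mean by math",
             "replies": [{"text": "proofs are math", "replies": []}]},
            {"text": "numeracy != mathematics", "replies": []},
        ],
    }
    print_thread(thread)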
The uncomfortable truth is that almost all of professional programming and innovative programming (creating useful stuff never before seen) never uses any of the advanced math skills that are prerequisites in every degree program. I think much of the sadism around teaching this is perpetuated by "I did it so you have to" and academic gatekeeping.
When you get really really good at programming and hit the most productive zone in your life, it feels like language. That you have the ability to just say it.
Picking up a skill without intentional study is great, but you still learned the skill. Programming languages are formal languages. Most mathematicians don't study foundations either.
Professional programming doesn't often make use of specific advanced mathematical knowledge, but I find it makes everyday use of the skills.
Knuth created TeX (LaTeX is built on top of it). Pandoc is written in Haskell, famous for being a completely useless academic language with no real purpose beyond torturing undergraduates (it says here.) Efficient search and data compression algorithms aren't hacked together in late-night hobby coding sessions.
Cryptography, digital signal processing for images, sound, and video, and ML core algorithms are all mathematical inventions. The digital world literally runs on them.
"Real world achievers" might want to try being a little less parochial and a little more educated about the originators of the concepts and environments they take for granted.
Vibe coding "Social AI chatbot network with ads = $$profit$$" or "Cat videos as a service" is only possible because the entire field stands on the shoulders of mathematical giants.
Like Rijndael?
I'd argue that if they can figure out recursion after hitting a brick wall like you describe, then that's a good indication they did have abstract math aptitude to begin with.
In my personal experience the hardest part of mathematics is its grammar and language. It's very different from natural language, whereas programming is much closer. You can take nearly any math problem and convert it to pseudocode, and it'll be much more understandable for those programmers who never studied (or struggled with) advanced math.
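As a small illustration of that translation (my own toy example): the sum of i^2 for i from 1 to n, written both as the closed-form formula n(n+1)(2n+1)/6 and as the step-by-step loop most programmers would reach for.

    def sum_of_squares_steps(n: int) -> int:
        total = 0
        for i in range(1, n + 1):  # "for each i from 1 to n"
            total += i * i         # "add i squared to the running total"
        return total

    def sum_of_squares_formula(n: int) -> int:
        return n * (n + 1) * (2 * n + 1) // 6

    assert sum_of_squares_steps(10) == sum_of_squares_formula(10) == 385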
Programming requires a base level of natural language aptitude that nearly all adults have, there's diminishing returns for anything approaching the levels of a poet or novelist for example.
It can be an invariant in a programming function, it can be a more general result, if you can write a proof, it is mathematics. Most algorithms involve proofs, so they are mathematics.
It has nothing to do with it being "sadism" or academic gatekeeping.
These people are doing mathematics without knowing it is mathematics. That's all.
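One concrete reading of "an invariant in a programming function" (a sketch of my own, not from the thread): binary search with its loop invariant written out. Arguing that the invariant holds on every iteration is exactly the kind of small informal proof being described.

    def binary_search(xs, target):
        lo, hi = 0, len(xs)
        # Invariant: if target is in xs, its index lies in the half-open range [lo, hi)
        while lo < hi:
            mid = (lo + hi) // 2
            if xs[mid] < target:
                lo = mid + 1  # target, if present, is to the right of mid
            else:
                hi = mid      # target, if present, is at mid or to its left
        return lo if lo < len(xs) and xs[lo] == target else -1

    assert binary_search([1, 3, 5, 7, 9], 7) == 3
    assert binary_search([1, 3, 5, 7, 9], 4) == -1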
> Why is it that the drop-outs are succeeding so wildly?
Here is where you can learn about confirmation bias and educate yourself.
I've recently taken up a daily practice to improve my cognitive skills. Currently I'm using NeuroNation, and it is quite great. The stimulus to start the practice was getting my memory tested (clinically) and learning that it's below norm.
Tiny sample size - 36 people completed
Numeracy has R^2 = .27; language has R^2 = .31.
They then run stepwise regression to determine variance contributions, seemingly ignoring their earlier results, and this leads to almost no contribution from numeracy. Why? Because the two have ~10% shared variance and stepwise regression is greedy; it will just take whatever you give it first.
I can't stress this part enough. If you got a second, very similar language test and added it to the model, you would _also_ find it adds almost no unique variance.
Everything they measure is incredibly noisy, and they do not once attempt to deal with this: human-based reviewers, time-to-completion, etc.
p-value for "language learning is more significant than numeracy" on the values they give (Steiger test) gives 0.772. Utterly insignificant.
Also, to beat the point home, just think about the argument here:
* Numeracy contributes 27% of variance
* Language skills contribute 31% of variance
* After the stepwise regression, numeracy contributes only 2% of unique variance. Because you added a correlated variable!
(I.e., if you sample the same signal twice at the sample sizes they had in the study, there's a 40% chance it'll be more than 0.04 away from the original sample, as numeracy was from language.)
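To make the shared-variance point concrete, here's a toy simulation (not the study's data; every number below is made up): two predictors built from a common latent factor each explain a similar chunk of variance on their own, yet whichever one enters the model second appears to add almost nothing "unique".

    import numpy as np

    rng = np.random.default_rng(0)
    n = 36  # roughly the number of participants who completed the course

    # A shared latent factor makes the two predictors correlate with each
    # other and with the outcome; the scales are arbitrary.
    shared = rng.normal(size=n)
    language = shared + rng.normal(scale=0.9, size=n)
    numeracy = shared + rng.normal(scale=0.9, size=n)
    outcome = shared + rng.normal(scale=1.0, size=n)

    def r2(X, y):
        # R^2 of an ordinary least-squares fit of y on the columns of X (plus intercept)
        X = np.column_stack([np.ones(len(y)), X])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        return 1.0 - resid.var() / y.var()

    both = np.column_stack([language, numeracy])
    print("language alone: R^2 =", round(r2(language[:, None], outcome), 2))
    print("numeracy alone: R^2 =", round(r2(numeracy[:, None], outcome), 2))
    print("numeracy 'unique' on top of language:",
          round(r2(both, outcome) - r2(language[:, None], outcome), 2))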
>>> In this study, high levels of these beta oscillations were associated with faster learning and more programming knowledge
This makes me think those binaural beat programs attuned to beta wave frequency might help with heavy coding sessions?
Can anyone point to studies that confirm/reject this?
But really it just depends:
- web or app information software: logic and language matter more here
- doing what Donald Knuth does: math matters more here
- If you can write a formal proof, starting from some assumptions and proving some result, it is mathematics. The assumptions are the axioms. Then you use logic, which is the same programming logic, and then you get to some result.