
Functions Are Vectors (2023)

https://thenumb.at/Functions-are-Vectors/
141•azeemba•8h ago

Comments

pvg•8h ago
Discussion at the time https://news.ycombinator.com/item?id=36921446
nyrikki•7h ago
The one place where I think the previous discussion lost something important, at least for me, was with functions.

The popular lens is the porcupine concept when infinite dimensions for functions is often more effective when thought of as around 8:00 in this video.

https://youtu.be/q8gng_2gn70

While that video obviously is not fancy, it will help with building an intuition about fixed points.

The point being that the dimensions are the points needed to describe a function in a plane, and not so much orthogonal dimensions.

Specifically with fixed points and non-expansive mappings.

Hopefully this helps someone build intuitions.
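
(To make the fixed-point angle concrete, here is a toy numpy sketch of my own, not from the video: sample a function on a grid so it becomes a plain vector of values, then iterate a contraction mapping on it. By the Banach fixed-point theorem the iterates converge to the unique fixed point.)

```python
import numpy as np

# Sample functions on a grid: a "function" is just a vector of values here.
x = np.linspace(0.0, 1.0, 201)
g = np.sin(2 * np.pi * x)

def T(f):
    # T(f) = 0.5*f + g is a contraction with Lipschitz constant 0.5,
    # so iteration converges to the unique fixed point f* = 2*g
    # (solve f = 0.5*f + g for f).
    return 0.5 * f + g

f = np.zeros_like(x)  # arbitrary starting "function"
for _ in range(60):
    f = T(f)

err = np.max(np.abs(f - 2 * g))  # distance to the fixed point
```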

olddustytrail•7h ago
> infinite dimensions for functions is often more effective when thought of as around 8:00

I guess it works if you look at it sideways.

chongli•6h ago
I see this a lot with math concepts as they begin to get more abstract: strange visualizations to try to build intuition. I think this is ultimately a dead-end approach which misleads rather than enlightens.

To me, the proper way of continuing to develop intuition is to abandon visualization entirely and start thinking about the math in a linguistic mode. Thus, continuous functions (perhaps on the closed interval [0,1], for example) are vectors precisely because this space of functions meets the criteria for a vector space:

* (+) vector addition where adding two continuous functions on a domain yields another continuous function on that domain

* (.) scalar multiplication where multiplying a continuous function by a real number yields another continuous function with the same domain

* (0) the existence of the zero vector which is simply the function that maps its entire domain of [0,1] to 0 (and we can easily verify that this function is continuous)

We can further verify the other properties of this vector space which are:

* associativity of vector addition

* commutativity of vector addition

* identity element for vector addition (just the zero vector)

* additive inverse elements (just multiply f by -1 to get -f)

* compatibility of scalar multiplication with field multiplication (i.e a(bf) = (ab)f, where a and b are real numbers and f is a function)

* identity element for scalar multiplication (just the number 1)

* distributivity of scalar multiplication over vector addition (so a(f + g) = af + ag)

* distributivity of scalar multiplication over scalar addition (so (a + b)f = af + bf)

So in other words, instead of trying to visualize an infinite-dimensional space, we’re just doing high school algebra with which we should already be familiar. We’re just manipulating symbols on paper and seeing how far the rules take us. This approach can take us much further when we continue on to the ideas of normed vector spaces (abstracting the idea of length), sequences of vectors (a sequence of functions), and Banach spaces (giving us convergence and the existence of limits of sequences of functions).
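
A quick numerical sanity check of a few of these axioms (my own sketch, not from the article), treating continuous functions on [0,1] as finely sampled vectors:

```python
import numpy as np

# Represent continuous functions on [0, 1] by their samples on a grid.
x = np.linspace(0.0, 1.0, 101)
f = np.exp(x)
g = np.cos(3 * x)
zero = np.zeros_like(x)  # the zero vector: the function x -> 0
a, b = 2.0, -1.5         # scalars from the field R

# commutativity of vector addition
assert np.allclose(f + g, g + f)
# (0) additive identity
assert np.allclose(f + zero, f)
# additive inverse: f + (-1)*f = 0
assert np.allclose(f + (-1.0) * f, zero)
# compatibility: a(bf) = (ab)f
assert np.allclose(a * (b * f), (a * b) * f)
# distributivity over vector addition: a(f + g) = af + ag
assert np.allclose(a * (f + g), a * f + a * g)
# distributivity over scalar addition: (a + b)f = af + bf
assert np.allclose((a + b) * f, a * f + b * f)
```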

ajkjk•5h ago
Funny, I agree that visualizations aren't that useful after a point, but when you said "start thinking about the math in a linguistic mode" I thought you were going to describe what I do, but then you described an entirely different thing! I can't learn math the way you described at all: when things are described by definitions, my eyes glaze over, and nothing is retained. I think the way you are describing filters out a large percentage of people who would enjoy knowing the concepts, leaving only the people whose minds work in that certain way, a fairly small subset of the interested population.

My third way is that I learn math by learning to "talk" in the concepts, which is I think much more common in physics than pure mathematics (and I gravitated to physics because I loved math but can't stand learning it the way math classes wanted me to). For example, thinking of functions as vectors went kinda like this:

* first I learned about vectors in physics and multivariable calculus, where they were arrows in space

* at some point in a differential equations class (while calculating inner products of orthogonal hermite polynomials, iirc) I realized that integrals were like giant dot products of infinite-dimensional vectors, and I was annoyed that nobody had just told me that because I would have gotten it instantly.

* then I had to repair my understanding of the word "vector" (and grumble about the people who had overloaded it). I began to think of vectors as the N=3 case and functions as the N=infinity case of the same concept. Around this time I also learned quantum mechanics where thinking about a list of binary values as a vector ( |000> + |001> + |010> + etc, for example) was common, which made this easier. It also helped that in mechanics we created larger vectors out of tuples of smaller ones: a spatial vector always has N=3 dimensions, a pair of spatial vectors is a single 2N = 6-dimensional vector (albeit with different properties under transformations), and that is much easier to think about than a single vector in R^6. It was also easy to compare it to programming, where there was little difference between an array with 3 elements, an array with 100 elements, and a function that computed a value on every positive integer on request.

* once this is the case, the Fourier transform, Laplace transform, etc are trivial consequences of the model. Give me a basis of orthogonal functions and of course I'll write a function in that basis, no problem, no proofs necessary. I'm vaguely aware there are analytic limitations on when it works but they seem like failures of the formalism, not failures of the technique (as evidenced by how most of them fall away when you switch to doing everything on distributions).

* eventually I learned some differential geometry and Lie theory and learned that addition is actually a pretty weird concept; in most geometries you can't "add" vectors that are far apart; only things that are locally linear can be added. So I had to repair my intuition again: a vector is a local linearization of something that might be macroscopically curved, and the linearity is what makes it possible to add and scalar-multiply it. And also that there is functionally no difference between composing vectors with addition or multiplication, they're just notations.

At no point in this were the axioms of vector spaces (or normed vector spaces, Banach spaces, etc) useful at all for understanding. I still find them completely unhelpful and would love to read books on higher mathematics that omit all of the axiomatizations in favor of intuition. Unfortunately the more advanced the mathematics, the more formalized the texts on it get, which makes me very sad. It seems very clear that there are two (or more) distinct ways of thinking that are at odds here; the mathematical tradition heavily favors one (especially since Bourbaki, in my impression) and physics is where everyone who can't stand it ends up.
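
The "integrals are giant dot products" observation above is easy to check numerically. A small sketch of my own (assuming a plain Riemann-sum inner product): sample two functions finely, and the scaled dot product approximates the integral of their product, with the orthogonality of distinct Fourier modes falling out directly.

```python
import numpy as np

# Sample [0, 2*pi] finely; a function becomes a long vector of samples.
n = 10_000
x = np.linspace(0.0, 2 * np.pi, n, endpoint=False)
dx = x[1] - x[0]

def inner(f, g):
    # Riemann-sum inner product: a dot product scaled by dx,
    # approximating the integral of f*g over [0, 2*pi].
    return np.dot(f, g) * dx

s1, s2 = np.sin(x), np.sin(2 * x)

ortho = inner(s1, s2)    # ~0: distinct Fourier modes are orthogonal
norm_sq = inner(s1, s1)  # ~pi: integral of sin^2 over a full period
```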

chongli•5h ago
> I can't learn math the way you described at all: when things are described by definitions, my eyes glaze over, and nothing is retained. I think the way you are describing filters out a large percentage of people who would enjoy knowing the concepts, leaving only the people whose minds work in that certain way, a fairly small subset of the interested population.

If you told me this in the first year of my math degree I would have included myself in that group. I think you’re right that a lot of people are filtered out by higher math’s focus on definitions and theorems, although I think there’s an argument to be made that many people filter themselves out before really giving themselves the chance to learn it. It took me another year or two to begin to get comfortable working that way. Then at some point it started to click.

I think it’s similar to learning to program. When I’m trying to write a proof, I think of the definitions and theorems as my standard library. I look at the conclusion of the theorem to prove as the result I need to obtain and then think about how to build it using my library.

So for me it’s a linguistic approach but not a natural language one. It’s like a programming language and the proofs are programs. Believe it or not, this isn’t a hand-wavey concept either, it’s a rigorous one [1].

[1] https://en.wikipedia.org/wiki/Curry%E2%80%93Howard_correspon...

Tainnor•4h ago
> When I’m trying to write a proof, I think of the definitions and theorems as my standard library. I look at the conclusion of the theorem to prove as the result I need to obtain and then think about how to build it using my library.

fwiw, this is exactly what you do when you're trying to formally prove some theorem in a language like Lean.

chongli•3h ago
I do want to learn theorem proving in Lean just for a hobby at some point. I haven't found a great resource for it though.
Tainnor•1h ago
Have you seen: https://leanprover-community.github.io/mathematics_in_lean/
chongli•38m ago
I hadn’t seen that. Thanks!
MalbertKerman•5h ago
> and I was annoyed that nobody had just told me that because I would have gotten it instantly.

Right?! In my path through the physics curriculum, this whole area was presented in one of two ways. It went straight from "You don't need to worry about the details of this yet, so we'll just present a few conclusions that you will take on faith for now" to "You've already deeply and thoroughly learned the details of this, so we trust that you can trivially extend it to new problems." More time in the math department would have been awfully useful, but somehow that was never suggested by the prerequisites or advisors.

ajkjk•4h ago
oh, my point was the opposite of that. The math department was totally useless for learning how anything made sense. I only understood linear algebra when I took quantum mechanics for instance. The math department couldn't be bothered to explain anything in any sort of useful way; you were supposed to prove pointless theorems about things you didn't understand.
MalbertKerman•2h ago
I did get a lot of that in the lower level math courses, where it kinda felt like the math faculty were grudgingly letting in the unwashed masses to learn some primitive skills to apply [spit] to their various fields, and didn't really give a shit if anybody understood anything as long as the morons could repeat some rituals for moving x around on the page. I didn't really understand integrals until the intermediate classical mechanics prof took an hour or two to explain what the hell we had been doing for three semesters of calculus.

But when I did go past the required courses and into math for math majors, things got a lot better. I just didn't find that out until I was about to graduate.

Tainnor•31m ago
> So I had to repair my intuition again: a vector is a local linearization of something that might be macroscopically, and the linearity is what makes it possible to add and scalar-multiply it. And also that there is functionally no difference between composing vectors with addition or multiplication, they're just notations.

Except none of this is true of vectors in general, although it might be true of very specific vector spaces in physics that you may have looked at. Matrices or continuous functions form vector spaces where you can add any vectors, no matter how far apart. Maybe what you're referring to is that differentiability allows us to locally approximate nonlinear problems with linear methods but that doesn't mean that other things aren't globally linear.

I also don't understand what you mean by "no difference between composing vectors with addition or multiplication", there's obviously a difference between adding and multiplying functions, for example (and vector spaces in which you can also multiply are another interesting structure called an algebra).

That's the problem if you just go from intuition to intuition without caring about the formalism. You may end up with the wrong understanding.

Intuition is good when guided by rigour. Terence Tao has written about this: https://terrytao.wordpress.com/career-advice/theres-more-to-...

The vector space axioms in the end are nothing more than saying: here's a set of objects that you can add and scale and here's a set of rules that makes sure these operations behave like they're supposed to.

tsimionescu•3h ago
> I see this a lot with math concepts as they begin to get more abstract: strange visualizations to try to build intuition. I think this is ultimately a dead-end approach which misleads rather than enlightens.

Isn't this how people arrived at most of these concepts historically, how the intuition arose that these are meaningful concepts at all?

For example, the notion of a continuous function arose from a desire to explicitly classify functions whose graph "looks smooth and unbroken". People started with the visual representation, and then started to build a formalism that explains it. Once they found a formalism that was satisfying for regular cases, they could now apply it to cases where the visual intuition fails, such as functions on infinite-dimensional spaces. But the concept of a continuous function remains tied to the visual idea, fundamentally that's where it comes from.

Similarly with vectors: you have to first develop an intuition of the visual representation of what vector operations mean in a simple-to-understand vector space like Newtonian two-dimensional or three-dimensional space. Only after you build this clean and visual intuition can you really start understanding the formalization of vectors, and then start extending the same concepts to spaces that are much harder or impossible to visualize. But that doesn't mean that vector addition is an arbitrary operation labeled +: vector addition is a meaningful concept for spatial vectors, one that you can formally extend to other operations if they follow certain rules while retaining many properties of the two-dimensional case.

Scene_Cast2•7h ago
Same thing in video form explained by a different person - https://youtu.be/mhEFJr5qvLo
malwrar•7h ago
So cool! This is the first time I’ve ever read about a math idea and felt a deep pull to know more.
skybrian•7h ago
It seems like mentioning some of the applications at the beginning would motivate learning all these definitions.
almostgotcaught•7h ago
> "The material is not motivated." Not motivated? Judas just stick a dagger in my heart. This material needs no motivation. Just do it. Faith will come. He's teaching you analysis. Not selling you a used car. By the time you are ready to read this book you should not need motivation from the author as to why you need to know analysis. You should just feel a burning in you chest that can only be quenched by arguments involving an arbitrary sequence {x_n} that converges to x in X.

https://www.amazon.com/review/R23MC2PCAJYHCB

skybrian•7h ago
Not sure what I'm supposed to get from that. I guess some people care a little too much about math and have trouble relating to others?
almostgotcaught•6h ago
you're supposed to get that the cynical lens you're applying here doesn't fit - if you aren't intrinsically motivated to read this stuff then it's not for you. which is fine btw because (functional) analysis isn't a required class.
TheRealPomax•6h ago
If you need "practical applications" for some part of math to have value to you, then large parts of math will not be for you. That's fine, but that's also something you should accept and internalize: math is already its own application, we dig through it in order to better understand it, and that understanding will (with rather advanced higher education) be applicable to other fields, which in turn may have practical uses.

Those practical uses are someone else's problem to solve (even if they rely on math to solve them), and they can write their own web pages on how functions as vectors help solve specific problems in a way that's more insightful than using "traditional" calculus, and get those upvoted on HN.

But this link has a "you must be this math to ride" gate, it's not for everyone, and that's fine. It's a world wide web, there's room for all levels of information. You need to already appreciate the problems that you encountered in non-trivial calculus to appreciate this interpretation of what a function even is and how to exploit the new power that gives you.

skybrian•6h ago
I don't see any such "math gate" on this link. Also, this math does have practical applications, but they're not mentioned until very late in the article.

My suggestion is that briefly mentioning them up front might be nice. I didn't mean to start a big argument about it.

almostgotcaught•6h ago
i'll never fathom why people on hn treat a post as an auto-invite for unsolicited feedback.
LegionMammal978•6h ago
Yet some parts of math are 'preferred' over others, in that most 'serious' mathematicians would rather read 100 pages about functional analysis than 100 pages of meandering definitions from some rando trying to solve the Collatz conjecture.

Some people would like to have a filter for what to spend their time on, better than "your elders before you have deemed these ideas deeply important". One such filter is "Can these ideas tell us nontrivial things about other areas of math?" That is, "Do they have applications?"

Short of the strawman of immediate economic value, I don't think it's wrong to view a subject with light skepticism if it seemingly ventures off into its own ivory tower without relating back to anything else. A few well-designed examples can defuse this skepticism.

ethan_smith•6h ago
This perspective is crucial for understanding signal processing, machine learning, and quantum mechanics. Viewing functions as vectors enables practical techniques like Fourier transforms and kernel methods that underlie many modern technologies.
sixo•5h ago
The genre of this article is not pedagogical, really. One usually learns these techniques in the course of a particular field like physics, electrical engineering, or theoretical chemistry. This article is best thought of as "a story you've seen before, but told from the beginning / ground up, with a lot of the connections to other topics and examples laid out for you". For that purpose, it's excellent, perhaps the best I've ever seen. It might also whet the appetite of a novice, but it's not really for that.
gizmo686•5h ago
The first paragraph and table of context both mention applications.
skybrian•5h ago
Yes, so it does. Perhaps I read too quickly.
tempodox•6h ago
Oh, my. Alice, meet rabbit hole.
MalbertKerman•6h ago
The jump from spherical harmonics to eigenfunctions on a general mesh, and the specific example mesh chosen, might be the finest mathematical joke I've seen this decade.
sixo•6h ago
Would you explain the joke for the rest of us?
xeonmc•5h ago
Spherical Harmonics approximating Spherical Cows?
dark__paladin•5h ago
assume spherical cow
MalbertKerman•5h ago
It's quietly reversing the traditional "We approximate the cow to be a sphere" and showing how the spherical math can, in fact, be generalized to solutions on the cow.
sixo•5h ago
oh. I did not interpret that blob as a cow. Thanks.
a3w•6h ago
Nice: the l and m values let you recover the orbitals from chemistry.

(This is where I learned at least half of the math on this page: theoretical chemistry.)

xeonmc•5h ago
also known as Applied Quantum Mechanics.
ttoinou•5h ago
Isn't this the opposite way? Vectors are functions whose input space is a set of discrete dimensions. Let's not pretend going from the natural numbers to the reals is "simple"; the real numbers are a fascinating, non-obvious mathematical discovery. And the passage from a few numbers to all natural numbers (aleph0) is also non-obvious. So basically we have two aleph passages to transform N-D vectors into functions over the reals.
xeonmc•5h ago
Vectors are not necessarily discrete-domained. Anything that satisfies the vector space properties is a vector.
ttoinou•5h ago
I agree but I'm operating under the assumption of the article

  Conceptualizing functions as infinite-dimensional vectors lets us apply the tools of linear algebra to a vast landscape of new problems
layer8•5h ago
Linear algebra isn’t limited to discrete-dimensional vector spaces. Or what do you mean?
ttoinou•3h ago
See my other comment sibling.

And he's starting from the assumption vectors are finite (cf. the article)

Sharlin•1h ago
He does not assume anything! Any assumption is in your head only. Of course he starts from the specific type of vector spaces that's the most familiar to readers. But then he shows that there's nothing that requires a vector space to have a finite, or even countably infinite, dimension. What matters are the axioms.
gizmo686•5h ago
Vectors are an abstract notion. If you have two sets and two operations that satisfy the definition of a vector space, then you have a vector space; and we refer to elements of the vector set as "vectors" within that vector space.

The observation here is that the set of real-valued functions, combined with the set of real numbers and the natural notions of function addition and multiplication by a real number, satisfies the definition of a vector space. As a result, all the results of linear algebra can be applied to real-valued functions.

It is true that any vector space is isomorphic to a vector space whose vectors are functions. Linear algebra does make a lot of usage of that result, but it is different from what the article is discussing.
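
A concrete payoff of that observation (my own illustration, not from the article): on a finite-dimensional subspace of functions, a linear map like d/dx literally becomes a matrix. Here differentiation acts on polynomials of degree < 4 in the monomial basis:

```python
import numpy as np

# Polynomials of degree < 4 form a 4-dimensional function space with
# basis {1, x, x^2, x^3}; a polynomial is its coefficient vector [a0..a3].
# d/dx is linear, so in this basis it is a matrix: d/dx x^k = k*x^(k-1).
D = np.array([
    [0.0, 1.0, 0.0, 0.0],
    [0.0, 0.0, 2.0, 0.0],
    [0.0, 0.0, 0.0, 3.0],
    [0.0, 0.0, 0.0, 0.0],
])

p = np.array([5.0, -1.0, 0.0, 2.0])  # the polynomial 5 - x + 2x^3
dp = D @ p                           # its derivative, -1 + 6x^2
```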

ttoinou•3h ago
I agree, but we're using functions for different things here. Yes, some specific families of functions can be treated as vector spaces. In this article it seems like the author is pretending to take all real->real functions and treat them as if they are a vector space, whatever the content of the functions. Quote:

  we’ve built a vector space of functions
and later he admits it is impossible

  Ideally, we could express an arbitrary function f as a linear combination of these basis functions. However, there are uncountably many of them—and we can’t simply write down a sum over the reals. Still, considering their linear combination is illustrative:

They are uncountable because they are aleph1
998244353•2h ago
The set of all real->real functions is still a vector space.

This vector space also has a basis (even if it is not as useful): there is a (uncountably infinite) subset of real->real functions such that every function can be expressed as a linear combination of a finite number of these basis functions, in exactly one way.

There isn't a clean way to write down this basis, though, as you need to use Zorn's lemma or equivalent to construct it.

ttoinou•2h ago
I'd love to read more about that, he's not talking about that at all in this article though
gizmo686•2h ago
It is not required for vector spaces to have a basis. As it turns out, the claim that every vector space has a basis is equivalent to the axiom of choice, which seems well beyond the scope of the article.

However, the particular vector space in question (functions from R to R) does have a basis, which the author describes. That basis is not as useful as a basis typically is for finite-dimensional (or even countably infinite-dimensional) vector spaces, but it still exists.

ttoinou•1h ago
But it's the article that talks about vectors as a sequence of reals having a basis, and then extends that to infinite sequences of reals. The author is playing on multiple definitions of vector to produce a "wow, that's cool" effect, and that's bad maths
Sharlin•1h ago
There is only one definition of "vector space" (up to isomorphism anyway), and that's what the author uses. You'll note that he doesn't talk about bases at all, the assumption of a basis is entirely in your mind. The entire point of the article is that the ℝ→ℝ function space is a vector space. A vector space is not required to have a basis, but assuming the axiom of choice, every vector space does have (at least) one, including that of ℝ→ℝ functions.
sixo•5h ago
A few questions occur to me while reading this, which I am far from qualified to answer:

- How much of this structure survives if you work on "fuzzy" real numbers? Can you make it work? Where I don't necessarily mean "fuzzy" in the specific technical sense, but in any sense in which a number is defined only up to a margin of error/length scale, which in my mind is similar to "finitism", or "automatic differentiation" in ML, or a "UV cutoff" in physics. I imagine the exact definition will determine how much vectorial structure survives. The obvious answer is that it works like a regular Fourier transform but with a low-pass filter applied, but I imagine this might not be the only answer.

- Then if this is possible, can you carry it across the analogy in the other direction? What would be the equivalent of "fuzzy vectors"?

- If it isn't possible, what similar construction on the fuzzy numbers would get you to the obvious endpoint of a "fourier analysis with a low pass filter pre-applied?"

- The argument arrives at fourier analysis by considering an orthonormal diagonalization of the Laplacian. In linear algebra, SVD applies more generally than diagonalizations—is there an "SVD" for functions?

xeonmc•5h ago
I’d guess that it would be factored as “nonlinearity”, which might be characterized as some form of harmonic distortion, analogous to clipping nonlinearity of finite-ranged systems?

Perhaps some conjugate relation could be established between finite-range in one domain and finite-resolution in another, in terms of the effect such nonlinearities have on the spectral response.

sitkack•4h ago
A fuzzy vector is a Gaussian? Thinking of what it would be in 1, 2, 3 and n dimensions.
sfpotter•3h ago
1. Numerical methods for solving differential and integral equations are algorithms for solving algebraic equations (vector solutions) that arise from discretizing infinite-dimensional operator equations (function solutions). When we talk about whether these methods work, we usually do so in terms of their consistency and stability. There are multiple stages here: we start by talking about the well-posedness of the original equation (e.g. the PDE), then the convergence of the mathematical discretization, and then examine what happens when we try to program this thing on a computer. Usually what happens is these algorithms get implemented "on top" of numerical linear algebra, where algorithms like Gaussian elimination and different iterative solvers have been studied very carefully from the perspective of floating-point rounding errors etc. This kind of subsumes your concern about "fuzzy" real numbers. Remember that in double precision, if the number "1.0" represents "1 meter", then machine epsilon is atomic scale. So, frequently, you can kind of assume the whole process "just works"...

2/3. I'm not really sure what you mean by these questions... But if you want to do "fourier analysis with a filter preapplied", you'd probably just work within some space of bandlimited functions. If you only care about N Fourier modes, any time you do an operation which exceeds that number of modes, you need to chop the result back down to size.

4. In this context, it's really the SVD of an operator you're interested in. In that regard, you can consider trying to extend the various definitions of the SVD to your operator, provided that you carefully think about all spaces involved. I assume at least one "operator SVD" exists and has been studied extensively... For instance, I can imagine trying to extend the variational definition of the SVD... and the algorithms for computing the SVD probably make good sense in a function space, too...
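
To make the discretization step in (1) concrete, here is a minimal sketch of my own (assuming the standard second-order finite-difference stencil): discretizing -d²/dx² on (0, π) with Dirichlet boundaries gives a matrix whose lowest eigenvalues approach 1, 4, 9, ..., the spectrum of the continuous Laplacian whose eigenfunctions are the sine modes the article diagonalizes.

```python
import numpy as np

# Discretize -d^2/dx^2 on (0, pi) with zero (Dirichlet) boundary values,
# using the standard second-order finite-difference stencil.
n = 500
h = np.pi / (n + 1)
L = (2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2

# Continuous eigenpairs are sin(k*x) with eigenvalue k^2, k = 1, 2, 3, ...
evals = np.linalg.eigvalsh(L)  # sorted ascending
lowest = evals[:3]             # should approach [1, 4, 9]
```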

woopsn•2h ago
Convolution with the Dirac delta will give you an exact sample of f(0), and in principle a whole signal could be constructed as a combination of delayed delta signals - but we can't realize an exact delta signal in most spaces, only approximations.

As a result we get finite resolution and truncation of the spectrum. So "Fourier analysis with pre-applied lowpass filter" would be analysis of sampled signals, the filter determined by the sampling kernel (delta approximator) and properties of the DFT.

But so long as the sampling kernel is good (that is the actual terminology), we can form f exactly as the limit of these fuzzy interpolations.

The term "resolution of the identity" is associated with the fact that delta doesn't exist in most function spaces and instead has to be approximated. A good sampling kernel "resolves" the missing (convolutional) identity. I like thinking of the term also in the sense that these operators behave like the identity if it were only good up to some resolution.
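A small numerical sketch of that approximation (a narrowing Gaussian standing in for the delta; the test signal and widths are my own choices):

```python
import numpy as np

def gaussian_kernel(x, eps):
    """Approximate identity: unit area, width eps; tends to delta as eps -> 0."""
    return np.exp(-x**2 / (2 * eps**2)) / (eps * np.sqrt(2 * np.pi))

f = lambda t: np.cos(t) + 0.5 * t        # smooth test signal with f(0) = 1
x = np.linspace(-5.0, 5.0, 200001)
dx = x[1] - x[0]

for eps in (0.5, 0.1, 0.02):
    # (f * k_eps)(0) = integral of f(t) k_eps(-t) dt; k_eps is even
    sample = np.sum(f(x) * gaussian_kernel(x, eps)) * dx
    print(eps, sample)                   # approaches f(0) = 1 as eps shrinks
```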

gizmo686•1h ago
You can replace the real numbers with the rational numbers and maintain all of the vector structure.

If you wanted something more quantized, you can pick some length unit, d, and replace the real numbers with {..., -2d, -d, 0, d, 2d, ...}. This is closed under the standard addition and subtraction (and, when d is an integer, under multiplication too, giving a structure known as a "ring", with no notion of division). Using this instead of R does lose the vector structure, but it is still an example of the slightly more general notion of a "module": you can scale by integers, just not by arbitrary reals. Many of the linear algebra results for vector spaces apply to modules as well.
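A tiny sketch of that module structure, representing each lattice element by its integer coefficient (the choice d = 1/2 is my own):

```python
from dataclasses import dataclass
from fractions import Fraction

D = Fraction(1, 2)   # the length unit d (any nonzero rational works here)

@dataclass(frozen=True)
class Lattice:
    """An element k*d of the lattice dZ, stored by its integer coefficient k."""
    k: int

    def value(self):
        return self.k * D

    def __add__(self, other):
        return Lattice(self.k + other.k)

    def __sub__(self, other):
        return Lattice(self.k - other.k)

    def scale(self, n: int):
        # Module structure: scalars are integers, not arbitrary reals.
        return Lattice(self.k * n)

a, b = Lattice(3), Lattice(-5)
print((a + b).value())        # -1, i.e. -2 * d
print(a.scale(4).value())     # 6, i.e. 12 * d
# There is no division: a/b generally lands outside the lattice.
```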

> If it isn't possible, what similar construction on the fuzzy numbers would get you to the obvious endpoint of a "fourier analysis with a low pass filter pre-applied?"

If that is where you want to end up, you could pretty much start there. If you take all real-valued functions and apply a Fourier analysis with a low-pass filter to each of them, the resulting set still forms a vector space. Although I don't see any particular way of arriving at this vector space by manipulating functions pre Fourier transform.

simpaticoder•5h ago
The author asserts vectors are functions, specifically a function that takes an index and returns a value. He notes that as you increase the number of indices, a vector can represent an arbitrary function (he focuses on continuous, real-valued functions).

It's fun to simulate one thing with another, but there is a deeper and more profound sense in which vectors are functions in Clifford Algebra, or Geometric Algebra. In that system, vectors (and bi-vectors, ..., k-vectors) are themselves meaningful operators on other k-vectors. Even better, the entire system generalizes to n dimensions, and describes complex numbers, 2-d vectors, quaternions, and more, essentially for free. (Interestingly, the primary operation in GA is "reflection", the same operation you get in quantum computing with the Hadamard gate)

layer8•5h ago
Well, yeah, function spaces are an example of vector spaces: https://en.wikipedia.org/wiki/Vector_space#Function_spaces
dang•5h ago
This previous thread was also good: Functions are vectors - https://news.ycombinator.com/item?id=36921446 - July 2023 (120 comments)
EGreg•5h ago
Only functions on a finite domain are vectors.

Functions on a countable domain are sequences.

ttoinou•2h ago
Why is this being downvoted ? Could a downvoter elaborate ?
teiferer•2h ago
Because it makes little sense.

Vector spaces can have infinite dimension, so the "only" in the first sentence does not belong there.

The second sentence is also odd. How do you define "sequence"? Are there no finite sequences?

ttoinou•2h ago
I think it is "vector" taken in the way the author wrote about it / showed illustrations in the article.

For the second sentence, he's right, we could also write (wrongly) an article titled "Functions are Sequences" and (try to) apply what we know about dealing with countable sequences to functions

jschveibinz•4h ago
An engineering, signal processing extension/perspective:

An infinite sequence approximates a general function, as described in the article (see the slider bar example). In signal processing applications, functions can be considered (or forced) to be bandlimited so a much lower-order representation (i.e. vector) suffices:

- The subspace of bandlimited functions is much smaller than the full L^2 space

- It has a countable orthonormal basis (e.g., shifted sinc functions)

- The function can be written as (with sinc functions):

x(t) = \sum_{n=-\infty}^{\infty} x(nT) \cdot \text{sinc}\left( \frac{t - nT}{T} \right)

- This is analogous to expressing a vector in a finite-dimensional subspace using a basis (e.g. sinc)

Discrete-time signal processing is useful for comp-sci applications like audio, SDR, trading data, etc.
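The sinc reconstruction above can be sketched numerically, with a finite number of samples standing in for the infinite sum (the 2 Hz tone and sampling period are my own choices):

```python
import numpy as np

T = 0.1                                    # sampling period; band limit 1/(2T) = 5 Hz
n = np.arange(-200, 201)                   # finitely many samples in practice
f = lambda t: np.sin(2 * np.pi * 2.0 * t)  # 2 Hz tone, well inside the band

def reconstruct(t, samples, n, T):
    """x(t) = sum_n x(nT) sinc((t - nT)/T); np.sinc(u) = sin(pi u)/(pi u)."""
    return np.sum(samples * np.sinc((t - n * T) / T))

samples = f(n * T)
t0 = 0.123                                 # an off-grid time
print(reconstruct(t0, samples, n, T), f(t0))
```

Truncating the sum introduces a small error that shrinks as more samples are kept.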

QuesnayJr•4h ago
Full $L_2$ also has a countable orthonormal basis. Hermite functions are one example.
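A quick numerical check of that claim (my own sketch; Gauss-Hermite quadrature is exact here because the Gaussian factors in the Hermite functions cancel against the quadrature weight):

```python
import numpy as np
from numpy.polynomial.hermite import hermval, hermgauss
from math import factorial, sqrt, pi

# Hermite functions h_n(x) = H_n(x) exp(-x^2/2) / sqrt(2^n n! sqrt(pi))
# form an orthonormal basis of L^2(R). Check <h_m, h_n> = delta_{mn}.
nodes, weights = hermgauss(50)

def H(n, x):
    """Physicists' Hermite polynomial H_n evaluated at x."""
    c = np.zeros(n + 1)
    c[n] = 1.0
    return hermval(x, c)

def inner(m, n):
    norm = sqrt(2.0**m * factorial(m) * sqrt(pi)) * \
           sqrt(2.0**n * factorial(n) * sqrt(pi))
    # integral h_m h_n dx = sum_i w_i H_m(x_i) H_n(x_i) / norm
    return np.sum(weights * H(m, nodes) * H(n, nodes)) / norm

G = np.array([[inner(m, n) for n in range(6)] for m in range(6)])
print(np.round(G, 10))   # close to the 6x6 identity
```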
77pt77•4h ago
Any basic linear algebra course should talk about this, at least in the finite-dimensional case.

Polynomials come to mind.
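For instance, numpy's coefficient arrays make the vector-space structure of polynomials concrete (my own sketch):

```python
import numpy as np

# Polynomials of degree <= 2 form a 3-dimensional vector space;
# the coefficient arrays are the coordinate vectors.
p = np.polynomial.Polynomial([1.0, 0.0, 2.0])   # 1 + 2x^2
q = np.polynomial.Polynomial([0.0, 3.0])        # 3x

r = 2 * p + q                                   # vector-space operations
print(r.coef)                                   # [2. 3. 4.]
print(r(1.0))                                   # 2 + 3 + 4 = 9.0
```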

ttoinou•2h ago
Finite-degree polynomials are vectors, yes. Polynomials are a typical example you study when learning about linear algebra. That doesn't say anything about real functions in general, though. I don't think any linear algebra course should make the analogies made in this article; that'd be confusing.
mouse_•3h ago
I love the prerequisites section. Every technical blog post should start with this.
bmitc•3h ago
I will need to read through the rest of the article later, but the initial intuition building is a bit sloppy. None of those vectors drawn in the initial examples belong to the same vector space. Vectors need to emanate from the same origin to be considered as part of the same vector space.
ttoinou•2h ago
The author seems to be a great educator and computer scientist, much respect to his work. But from what I can gather, although I'd love to study more infinite-sized matrices, he proved/showed nothing in this article. What he wrote is not true at all; these are only analogies, not rigorous maths. Functions are not vectors. But finite polynomials are vectors, yes; this is trivial.
gizmo686•2h ago
https://thenumb.at/Functions-are-Vectors/#proofs

It's not a particularly interesting proof, but the author does prove that real valued functions are vectors. The bulk of the article is less about proofs, and more about showing how the above result is useful.

ttoinou•1h ago
Vectors in the way he talks about in the beginning. With indices (and then extending to "In higher dimensions, vectors start to look more like functions!"). Of course if you use the general meaning of every word, vectors are functions and functions are vectors, and this article shouldn't then have anything interesting to talk about.

  how the above result is useful
It doesn't seem useful at all to me, the examples in the article are not that interesting. On the contrary it is more confusing than anything to apply linear algebra to real valued functions.