At Traceoid (https://traceoid.ai) we are working on getting to AGI by looking at the problem through the lens of traces as sums over equivalence classes.
Join the Discord to learn more https://discord.com/invite/mr9TAhpyBW
In the case of the Fourier transform, it maps from the time domain to the frequency domain. In the frequency domain, we can see the amplitude (strength) of the signal at each frequency.
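A minimal numpy sketch of that mapping (the 50 Hz and 120 Hz test tones and the sample rate are made up for illustration):

```python
import numpy as np

# Hypothetical signal sampled at 1 kHz: a 50 Hz tone plus a weaker 120 Hz tone.
fs = 1000                      # sample rate (Hz)
t = np.arange(0, 1, 1 / fs)    # 1 second of samples
x = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)

# The FFT maps the time-domain samples to the frequency domain.
X = np.fft.rfft(x)
freqs = np.fft.rfftfreq(len(x), d=1 / fs)

# Amplitude at each frequency (factor 2 because rfft keeps only the
# non-negative frequencies; the DC bin would not get the factor, but it is
# zero here).
amp = 2 * np.abs(X) / len(x)
peaks = freqs[amp > 0.4]  # frequencies with significant amplitude
```

With exactly one second of samples, the FFT bins land on integer frequencies, so the two tones show up cleanly at 50 Hz and 120 Hz.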
Extraordinary claims require extraordinary evidence. Especially considering that a decent fraction of the CS/ML researchers that I know have solid physics and math backgrounds. Just off the top of my head, Marcus Hutter, David MacKay, Bernhard Scholkopf, Alex Smola, Max Welling, Christopher Bishop, etc. are/were prominent researchers with strong math and physics backgrounds. More recently, Jared Kaplan and Dario Amodei at Anthropic also have physics backgrounds, as do plenty of people at DeepMind.
To claim that you have noticed something in "100 years of physics and math research" that all of those people (and more) have missed is pure hubris.
Cliche phrase is cliche. And yeah, no shit, we are working on it.
Re: your other points: cool, do we have scalable energy-based models?
To be clear: it’s a very good video and you should watch it if you don’t have a feel for the Fourier transform. I’m just trying to proactively instil a tiny bit of dissatisfaction with what you will know at the end of it, so that you will then go looking for more.
(Fractional Fourier transform on the top face of the cube)
nomel•1h ago
And, with your own drawing: https://gofigure.impara.ai
femto•30m ago
The real world is somewhere in between. It must involve quantum mechanics (in a way I don't really understand), as maximum bandwidth/minimum wavelength bump up against limits such as the Planck length and virtual particles in a vacuum.
anvuong•16m ago
Essentially it's just projection in infinite-dimensional vector spaces.
incognito124•1h ago
Turns out... they are not! You can do the same thing using a different set of functions, like Legendre polynomials, or wavelets.
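For instance, a sketch of the same kind of decomposition in the Legendre basis, using numpy's `polynomial.legendre` module (the target function |x| and the degree 8 are arbitrary choices for illustration):

```python
import numpy as np
from numpy.polynomial import legendre

# Approximate |x| on [-1, 1] in the Legendre basis instead of sines.
x = np.linspace(-1, 1, 2001)
f = np.abs(x)

# legfit finds the least-squares combination of Legendre polynomials
# P_0..P_8, which on a fine grid coincides with the orthogonal projection.
coefs = legendre.legfit(x, f, deg=8)
approx = legendre.legval(x, coefs)

max_err = np.max(np.abs(f - approx))
```

Because |x| is even, the odd-degree coefficients come out as (numerically) zero, just as the sine terms would vanish in a Fourier series of an even function.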
MontyCarloHall•26m ago
Yup, any set of orthogonal functions! The special thing about sines is that they form an exceptionally easy-to-understand orthogonal basis, with a bunch of other nice properties to boot.
nestes•19m ago
Which gets to your point: you're absolutely correct that you can use a bunch of different sets of functions for your decomposition. Linear algebra just says you might as well use the most convenient one!
kingstnap•20m ago
Like you can make any vector in R^3 `<x,y,z>` by adding together a linear combination of `<1,0,0>`, `<0,1,0>`, `<0,0,1>`, turns out you can also do it using `<exp(j2pi*0*0/3), exp(j2pi*0*1/3), exp(j2pi*0*2/3)>`, `<exp(j2pi*1*0/3), exp(j2pi*1*1/3), exp(j2pi*1*2/3)>`, and `<exp(j2pi*2*0/3), exp(j2pi*2*1/3), exp(j2pi*2*2/3)>`.
You can actually do it with a lot of different bases. You just need them to be linearly independent.
For the continuous case, it isn't all that different from how you can use a linear combination of polynomials 1,x,x^2,x^3,... to approximate functions (like Taylor series).
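A quick numpy sketch checking this for the three complex-exponential vectors, i.e. b_k[n] = exp(j*2pi*k*n/3) (this is just the 3-point DFT basis; the test vector is arbitrary):

```python
import numpy as np

# Basis matrix: B[k, n] = exp(j*2*pi*k*n/3) for k, n = 0, 1, 2.
n = np.arange(3)
B = np.exp(2j * np.pi * np.outer(n, n) / 3)

# The rows are mutually orthogonal: <b_k, b_m> = 3 if k == m, else 0.
gram = B.conj() @ B.T

# So any v in C^3 is a linear combination of them, with coefficients given
# by projection: c_k = <b_k, v> / ||b_k||^2. This is exactly the DFT,
# up to the 1/3 normalization.
v = np.array([1.0, 2.0, 3.0])
coefs = (B.conj() @ v) / 3
reconstructed = (coefs @ B).real
```

The reconstruction recovers v exactly, which is the finite-dimensional version of recovering a signal from its Fourier coefficients.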