Show HN: A searchable frontend for HN Who is Hiring that runs in your browser

https://dheerajck.github.io/hnwhoishiring/
1•DHEERAJCK•2m ago•0 comments

Show HN: Datajoi – AI-SQL Client in the Browser

https://www.datajoi.com/
1•datajoi•3m ago•0 comments

AI Malware Is Here: New Report Shows How Fake AI Tools Are Spreading Ransomware

https://blog.talosintelligence.com/fake-ai-tool-installers/
1•karlperera•3m ago•1 comments

Xmonad: A dynamically tiling window manager that is written in Haskell

https://github.com/xmonad/xmonad
1•behnamoh•4m ago•0 comments

Secrets of The Gobi Wall Revealed

https://archaeology.org/news/2025/05/30/secrets-of-the-gobi-wall-revealed/
1•bookofjoe•5m ago•0 comments

Rapid Fullstack Development

https://rapidfullstackdevelopment.com/chapter-1
1•mefengl•5m ago•0 comments

Palace: 3D Finite Element Solver for Computational Electromagnetics

https://awslabs.github.io/palace/stable/
1•areoform•7m ago•0 comments

Show HN: SupaWP – Supabase WordPress Integration

https://dalenguyen.me/blog/2025-04-19-supabase-wordpress-integration-supawp-plugin
1•dalenguyen•9m ago•1 comments

Racing the Beam Explained - Atari 2600 CPU vs. CRT Television [video] (2021)

https://www.youtube.com/watch?v=sJFnWZH5FXc
1•Bluestein•9m ago•0 comments

I predicted AI journalism 10 years ago – and I got it wrong

https://opeonikute.dev/posts/ai-journalism
1•opeonikute•10m ago•0 comments

Ask HN: How Are Parents Who Program Teaching Their Kids Today?

1•laze00•11m ago•1 comments

Show HN: Viidure Dash Cam App

https://viidure.app
1•kevinhacker•12m ago•0 comments

Waydroid: Boot a full Android system on a regular GNU/Linux system

https://github.com/waydroid/waydroid
1•thunderbong•14m ago•0 comments

Understanding Inequality, Part I

https://paulkrugman.substack.com/p/understanding-inequality-part-i
1•rbanffy•20m ago•0 comments

Show HN: CXcompress is a lossless text/binary-code compressor

https://github.com/ZetaCrush/CXcompress
1•s3cfast•20m ago•1 comments

Type of Fiber Could Have Weight Loss Benefits Similar to Ozempic

https://www.sciencealert.com/this-type-of-fiber-could-have-weight-loss-benefits-similar-to-ozempic
1•amichail•20m ago•0 comments

The Coming Age of Wisdom Work

https://every.to/context-window/the-coming-age-of-wisdom-work?ph_email=rbanffy%40gmail.com
1•rbanffy•21m ago•1 comments

Logical Fallacies

https://owl.purdue.edu/owl/general_writing/academic_writing/logic_in_argumentative_writing/fallacies.html
1•a_w•21m ago•0 comments

The AI Question: How Do We Build Systems That Require Less Code?

https://alonso.network/systems-that-require-less-code/
1•daniloa•22m ago•1 comments

Solar Storms Are Pushing Elon Musk's Satellites Back to Earth

https://gizmodo.com/solar-storms-are-pushing-elon-musks-satellites-back-to-earth-2000608452
3•amichail•25m ago•0 comments

Real Estate AI App looking for feedback

1•llermaly•31m ago•0 comments

Print CHR$(205.5+RND(1));: Goto 10

https://10print.org/
1•abhi9u•31m ago•1 comments

Show HN: macOS app to keep task-related content in always-visible desktop spaces

https://www.mijick.com/
1•mijick•33m ago•0 comments

Builder.ai Faked Business with Indian Firm VerSe to Inflate Sales

https://www.bloomberg.com/news/articles/2025-05-30/builder-ai-faked-business-with-indian-firm-verse-to-inflate-sales-sources-say
1•gmays•33m ago•0 comments

How Dot Files Became Hidden Files

https://glenda.0x46.net/articles/dotfiles/
2•wietze•34m ago•0 comments

Electronics from Peanut Shells

https://physics.aps.org/articles/v18/112
2•rhollos•35m ago•1 comments

Boxtype – A Puzzle Game

https://inconvergent.net/2025/boxtype/
3•surprisetalk•35m ago•0 comments

Arcade Rhythm Games

https://cadence.moe/blog/2025-05-18-arcade-rhythm-games
1•surprisetalk•36m ago•0 comments

Hacked

https://sheep.horse/2025/5/hacked%21.html
1•surprisetalk•36m ago•0 comments

Darwin Gödel Machine

https://gonzoml.substack.com/p/darwin-godel-machine
1•che_shr_cat•37m ago•0 comments

Gradients Are the New Intervals

https://www.mattkeeter.com/blog/2025-05-14-gradients/
141•surprisetalk•1d ago

Comments

kragen•1d ago
This is very exciting! It seems like a lot of the interval stuff is bringing to fruition my idle speculations from 6 years ago in https://dercuano.github.io/notes/interval-raymarching.html. I'll have to read this carefully to see if it's actually the same approach or a different one that uses the same algebras.
mkeeter•1d ago
Interesting post, thanks for the link!

You may also enjoy "Spelunking the Deep: Guaranteed Queries on General Neural Implicit Surfaces via Range Analysis", which uses interval arithmetic (ish) to raymarch neural implicit surfaces:

https://arxiv.org/abs/2202.02444

kragen•1d ago
That's fantastic, thanks! I didn't know about neural implicit surfaces at all.
fph•1d ago
One of the benefits of intervals is that you can ensure your results are correct irrespective of floating-point errors, if you carefully round all computations in the correct direction.

I don't think you can ensure that with gradients though: if f and f' are computed in machine arithmetic, cancellation errors might pile up.
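
To make the directed-rounding point above concrete, here is a rough Python sketch of outward-rounded interval addition and multiplication. The Interval class is illustrative, not anything from the article, and widening each bound by one ulp with math.nextafter is a crude stand-in for true directed rounding:

  import math

  class Interval:
      """Closed interval [lo, hi]; every operation rounds outward so the
      exact real result is always contained despite floating-point error."""
      def __init__(self, lo, hi):
          self.lo, self.hi = lo, hi

      def __add__(self, other):
          return Interval(math.nextafter(self.lo + other.lo, -math.inf),
                          math.nextafter(self.hi + other.hi, math.inf))

      def __mul__(self, other):
          ps = [self.lo * other.lo, self.lo * other.hi,
                self.hi * other.lo, self.hi * other.hi]
          return Interval(math.nextafter(min(ps), -math.inf),
                          math.nextafter(max(ps), math.inf))

      def __repr__(self):
          return f"[{self.lo!r}, {self.hi!r}]"

  x = Interval(0.1, 0.1)          # 0.1 is not exactly representable
  print(x + x, x * x)             # enclosures of 0.2 and 0.01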

kragen•1d ago
It's true; this is a different application of interval arithmetic, not the usual application, which is, as you say, to avoid numerical stability issues. By and large, graphics researchers don't give a shit about numerical stability issues. They're the Ed Woods of numerical computation.
sfpotter•19h ago
The point of interval arithmetic isn't really to deal with numerical stability, per se... it's to make it possible to do rigorous and validated computations on a computer. Numerical stability is a different concept that is somewhat orthogonal. I could carry out a numerically unstable computation using interval arithmetic, but using interval arithmetic wouldn't magically make the computation stable; it would just give you error bounds (which may indeed be quite loose and unhelpful if the algorithm is unstable).
kragen•5h ago
From my point of view, the problem with numerical instability is that your program computes the wrong answer, and, worse, gives you no clue that it's wrong. Interval and other self-validating methods do fix that problem: the answer may not be useful, but at least it's not wrong, and they tell you there has been trouble.
constantcrying•1d ago
>I don't think you can ensure that with gradients though: if f and f' are computed in machine arithmetic, cancellation errors might pile up.

Yes, you can. You need to do automatic differentiation with interval arithmetic. This gives you a mathematically correct result for the derivative.

Always keep in mind with interval arithmetic that (-inf, inf) is also a mathematically correct result. Keeping intervals small is the challenge, and it means that any algorithm not specifically developed for interval arithmetic is likely to produce useless results.
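
A minimal sketch of what that combination could look like: forward-mode dual numbers whose value and derivative slots are both intervals, so the derivative enclosure is rigorous. Rounding directions are omitted for brevity, and the names are mine, not the article's:

  from dataclasses import dataclass

  def i_add(a, b):
      return (a[0] + b[0], a[1] + b[1])

  def i_mul(a, b):
      ps = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
      return (min(ps), max(ps))

  @dataclass
  class IDual:
      """Forward-mode dual number whose value and derivative are both
      intervals (lo, hi); rounding directions are omitted for brevity."""
      val: tuple
      der: tuple

      def __mul__(self, other):
          # Product rule, evaluated entirely in interval arithmetic.
          return IDual(i_mul(self.val, other.val),
                       i_add(i_mul(self.der, other.val),
                             i_mul(self.val, other.der)))

  # Enclose f(x) = x * x and f'(x) = 2x over x in [1, 2].
  x = IDual(val=(1.0, 2.0), der=(1.0, 1.0))
  fx = x * x
  print(fx.val)  # (1.0, 4.0): encloses the true range [1, 4]
  print(fx.der)  # (2.0, 4.0): encloses the true derivative range [2, 4]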

helltone•1d ago
I think perhaps this could be done in other ways that don't require interval arithmetic for autodiff, only that the gradient is conservatively computed; in other words, carrying the numerical error from f into f'.
kragen•1d ago
This doesn't help if your computation for f is numerically unstable, unless you compute f with interval arithmetic too.
fph•1d ago
> You need to do automatic differentiation with interval arithmetic.

But that kinda defeats the point of replacing interval arithmetic with gradients, though.

3abiton•1d ago
It started interestingly, but then

> This blog post assumes a vague understanding of implicit surface rasterization, and how interval arithmetic is used to both skip regions of space and simplify complex expressions.

Can anyone give me a quick rundown of the article?

meindnoch•1d ago
Which part do you not understand?

An implicit surface is defined by F(x,y,z) = 0. Implicit surfaces can be rendered by checking whether the camera ray going through pixel x,y intersects the surface or not. An axis-aligned bounding box (AABB) in 3D is the product of intervals [x_min, x_max]x[y_min, y_max]x[z_min, z_max]. The interval extension of F(x,y,z) gives you an interval [F_min, F_max] for such an AABB. If [F_min, F_max] doesn't contain 0, then you can discard the whole AABB for rendering purposes, because the surface is guaranteed to be not there. This can speed up rendering substantially.
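
As a concrete illustration of that culling test, a small Python sketch for the implicit sphere F(x,y,z) = x^2 + y^2 + z^2 - 1. The function names are made up for the example, and rounding directions are omitted:

  def i_square(lo, hi):
      # Interval extension of t -> t*t (handles intervals straddling zero).
      a, b = lo * lo, hi * hi
      return (0.0 if lo <= 0.0 <= hi else min(a, b), max(a, b))

  def sphere_range(xb, yb, zb, r=1.0):
      """Enclosure of F(x,y,z) = x^2 + y^2 + z^2 - r^2 over an AABB given
      as three (min, max) pairs."""
      xs, ys, zs = i_square(*xb), i_square(*yb), i_square(*zb)
      return (xs[0] + ys[0] + zs[0] - r * r, xs[1] + ys[1] + zs[1] - r * r)

  def box_may_contain_surface(xb, yb, zb):
      f_min, f_max = sphere_range(xb, yb, zb)
      return f_min <= 0.0 <= f_max   # if False, cull the whole box

  print(box_may_contain_surface((2.0, 3.0), (2.0, 3.0), (2.0, 3.0)))    # False: cull
  print(box_may_contain_surface((0.5, 1.5), (-0.5, 0.5), (-0.5, 0.5)))  # True: recurse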

3abiton•17h ago
Kudos for replying, that's already quite helpful.
constantcrying•1d ago
>In this case, "the Lipschitz property" means that the gradient of the distance value is bounded

This is total nonsense. The point of Lipschitz continuity is that it is more than continuity and less than differentiability. If you assert that it is differentiable the concept loses all meaning. It is specifically interesting because you do not have to assert differentiability.

porridgeraisin•1d ago
Yes.

I think the author meant that the rate of change is bounded [without the function needing to be differentiable] but used the term "gradient" anyway. I guess you shouldn't use the term "rate of change" either? I guess

  The Lipschitz property means the function's output doesn't change faster than linearly with input

is precise in that context
constantcrying•1d ago
He even gives the definition for the L=1 Lipschitz condition afterwards.

Lipschitz continuity means the difference quotients are globally bounded. Differentiability, by definition, means the difference quotient converges at every point.

These are somewhat related, but different things.
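
For reference, the two definitions being contrasted throughout this thread, paraphrased (not quoted) from the Wikipedia article, with the article's special case L = 1 noted:

  % f is L-Lipschitz on D (no derivative required):
  \exists L \ge 0 : \quad |f(x) - f(y)| \le L\,|x - y| \quad \text{for all } x, y \in D

  % f is differentiable at x (the difference quotient converges):
  f'(x) = \lim_{h \to 0} \frac{f(x+h) - f(x)}{h}

  % The article's special case is L = 1:
  |f(x) - f(y)| \le |x - y|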

kragen•1d ago
There are functions that are differentiable, but not Lipschitz, e.g. sin(1/x) on (0, ∞). In this context the Lipschitz property allows you to do things that mere differentiability doesn't, and, in the context of assuming that your function is differentiable, the definition given for the Lipschitz property is correct. It's true that there are also functions that are Lipschitz-continuous but not differentiable, but in this context they aren't interesting.
constantcrying•1d ago
> It's true that there are also functions that are Lipschitz-continuous but not differentiable, but in this context they aren't interesting.

They absolutely are, even in the article. E.g. the max function or the 1-norm are non-differentiable, but Lipschitz continuous.

There is literally no point in talking about Lipschitz continuity for differentiable functions, because it is equivalent to the derivative being bounded.

kragen•1d ago
Hmm, I guess you're right. But those functions (highly relevant here!) are differentiable almost everywhere; is that enough for Lipschitz continuity to be equivalent to the derivative being bounded?

I still think there's a point in talking about Lipschitz continuity for differentiable functions because the algorithms they're talking about depend on the Lipschitz property, not the differentiability.

constantcrying•21h ago
>because the algorithms they're talking about depend on the Lipschitz property, not the differentiability.

That is why the definition should not contain the differentiability condition. It is violated by the examples and irrelevant.

My point is that the Lipschitz property, as defined e.g. in the Wikipedia article, is the important part. Also assuming differentiability is unhelpful, because it misses the point and is even violated by the rest of the article.

kragen•5h ago
You seem to have sedulously avoided answering my question.
ajkjk•22h ago
The article is talking about functions whose changes over a displacement dx are not just bounded by K |dx| for some K, but specifically by the value K=1. So it's not the same thing as the Lipschitz property in analysis, although I guess it's inspired by it--but as a result it's concrete and useful for algorithms.
constantcrying•21h ago
>The article is talking about functions whose changes over a displacement dx are not just bounded by K |dx| for some K,

It is also talking about functions which do not have a derivative at all and therefore could not possibly have that property. The point of Lipschitz functions is that they need not have a derivative.

>So it's not the same thing as the Lipschitz property in analysis, although I guess it's inspired by it--but as a result it's concrete and useful for algorithms.

That is just some arbitrary scaling. Only relevant to have the math look nicer. E.g. if f is Lipschitz for L=10, then f/10 is Lipschitz for L=1.

kragen•12h ago
> It is also talking about functions which do not have a derivative at all and therefore could not possibly have that property [that their changes over a displacement dx are bounded by K |dx| for some K]

Functions that do not have a derivative at all can certainly have that property. Perhaps you were confused by the notation "dx", suggesting a differential, for the displacement, rather than the conventional Δx, but the sentence makes no sense if we try to read it as a statement about differentials. Possibly your correspondent is not a Greek Keyboard Achiever. Moreover, the functions they are discussing, while indeed not differentiable, are continuous and differentiable almost everywhere, so you can also prove Lipschitz continuity if you can prove a bound on their gradients at the points where they do exist.

> That is just some arbitrary scaling. Only relevant to have the math look nicer. E.g. if f is Lipschitz for L=10, then f/10 is Lipschitz for L=1.

It's a relevant difference if you don't know the Lipschitz constant L or a bound on it, because then you don't know what to divide by.

constantcrying•2h ago
>Functions that do not have a derivative at all can certainly have that property. Perhaps you were confused by the notation "dx", suggesting a differential, for the displacement, rather than the conventional Δx, but the sentence makes no sense if we try to read it as a statement about differentials. Possibly your correspondent is not a Greek Keyboard Achiever. Moreover, the functions they are discussing, while indeed not differentiable, are continuous and differentiable almost everywhere, so you can also prove Lipschitz continuity if you can prove a bound on their gradients at the points where they do exist.

The Lipschitz property is not a property of functions with a derivative. I really do dislike this kind of drive-by commenting, where you ignore an entire conversation and pretend someone does not understand what a gradient is.

ajkjk•11h ago
You seem really confused about something. The article is talking about SDFs that do not have that property, which it is then trying to force to have that property via a hack of sorts (f/|del f| apparently), because once the property holds it is very useful for making the algorithm far more efficient.

The arbitrary scaling is not arbitrary if the whole algorithm is defined around K=1. The Lipschitz property in analysis is "bounded by any K". The algorithm is specifically using "bounded by K=1".

Also: for the purpose of numerical methods, it does not matter if a function "has a derivative" in a classical sense or not. Certainly distributions, step functions, etc are fine. Numerical methods cannot, after all, tell if a function is discontinuous or not. The article is using some known closed-form functions as examples, but it is trying to illustrate the general case, which is not going to be a closed form but just an arbitrary SDF that you constructed from somewhere---which it will be very useful to enforce the K=1 Lipschitz condition on.
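
A rough sketch of the kind of normalization being described (the field divided by its gradient magnitude, i.e. f/|del f|), using central differences in place of real automatic differentiation. This is illustrative only, and, as discussed elsewhere in the thread, it does not guarantee the result is globally 1-Lipschitz:

  import math

  def normalize_by_gradient(f, eps=1e-6):
      """g(p) ~= f(p) / |grad f(p)|; the f/|del f| normalization mentioned
      above, with central differences standing in for real autodiff."""
      def g(x, y, z):
          gx = (f(x + eps, y, z) - f(x - eps, y, z)) / (2 * eps)
          gy = (f(x, y + eps, z) - f(x, y - eps, z)) / (2 * eps)
          gz = (f(x, y, z + eps) - f(x, y, z - eps)) / (2 * eps)
          norm = math.sqrt(gx * gx + gy * gy + gz * gz)
          return f(x, y, z) / max(norm, eps)   # crude guard near critical points
      return g

  # A non-SDF implicit sphere: F = x^2 + y^2 + z^2 - 1 has |grad F| = 2|p|, not 1.
  F = lambda x, y, z: x * x + y * y + z * z - 1.0
  G = normalize_by_gradient(F)
  print(F(2.0, 0.0, 0.0), G(2.0, 0.0, 0.0))   # 3.0 vs. ~0.75 (true distance is 1.0)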

kragen•5h ago
I think there are probably different populations commenting here with different backgrounds, and consequently different assumptions and different blind spots. For example, for Newton's Method, it does very much matter whether the function has a derivative in the classical sense everywhere in the relevant interval; the quadratic convergence proof depends on it. Discontinuous functions can break even binary chop.
constantcrying•2h ago
This is about differentiability.

>which it is then trying to force to have that property via a hack of sorts (f/|del f| apparently)

The functions he talks about do not have a derivative. So that hack is nonsensical.

ajkjk•1h ago
Evidently they do? Maybe there is some domain knowledge you're missing here as you belligerently protest that it's nonsense.
mkeeter•1d ago
(OP here)

This is taken directly from the paper's introduction, which admittedly uses the more specific terminology of "1-Lipschitz signed distance bounds".

The paper cites the original Hart '96 paper on sphere tracing; quoth Hart, "a function is Lipschitz if and only if the magnitude of its derivative remains bounded".

https://graphics.stanford.edu/courses/cs348b-20-spring-conte...

I wonder if there's a terminology schism here between computer graphics and numerical analysis folks.

constantcrying•1d ago
>I wonder if there's a terminology schism here between computer graphics and numerical analysis folks.

The first group just pretends every function has a derivative (even when it clearly does not); the other doesn't.

The linked Wikipedia article gets it exactly right; I do not know why you would link to something which straight up says your definition is incorrect.

There is no point in talking about Lipschitz continuity when assuming that there is a derivative; you assume that it is Lipschitz because it is a weaker assumption. The key reason Lipschitz continuity is interesting is that it allows you to talk about functions without a derivative, almost as if they had one. It is the actual thing which makes any of this work.

sfpotter•19h ago
The concept of a Lipschitz function comes from mathematical analysis; neither computer graphics nor numerical analysis. It's straightforward to find the definition of a Lipschitz function online, and it is not in terms of its derivative. If a function is differentiable, then your quote applies; but again, it isn't the definition of a Lipschitz function.

I'd say this is a little pedantic, save for the fact that your function of interest (an SDF) isn't a differentiable function! It has a big, crucially important subset of points (the caustic sets) where it fails to be differentiable.

srean•1d ago
> This is total nonsense... If you assert that it is differentiable the concept loses all meaning.

Citation please.

Yes, Lipschitz functions need not be differentiable, but when they are, that bound holds and is very useful. Using that bound is bread and butter in convex analysis of differentiable convex functions.

Curb the snark.

constantcrying•1d ago
>Citation please.

The article links to the Wikipedia page, which gets it right.

"An everywhere differentiable function g : R → R is Lipschitz continuous (with K = sup |g′(x)|) if and only if it has a bounded first derivative"

Lipschitz continuity for differentiable functions is just having a bounded derivative. The Lipschitz property suddenly becomes uninteresting, as it just falls out of the assumptions; the interesting fact, which allows you to use non-differentiable functions, is that not assuming differentiability, but assuming Lipschitz continuity, is enough.

>Using that bound is bread and butter in convex analysis of differentiable convex functions.

It also is the bread and butter in the analysis of PDEs, but it is so precisely because Lipschitz continuity is a weaker property than differentiability. In the context of the article you want to talk about non-differentiable functions, e.g. max, which you couldn't if you assumed differentiability.

The reason this is important is that choosing Lipschitz over differentiable is what makes all this work.

srean•1d ago
Yes we know that. I object to 'loses all meaning' when the function is differentiable. It certainly doesn't.

It gives the special property that the derivative is bounded, a property that's not true for arbitrary differentiable functions.

constantcrying•21h ago
My point is that the article is using the Lipschitz property to get its results. This makes it unnecessary and even wrong to introduce Lipschitz continuity only for functions with a derivative. Especially since the article actually uses functions which do not have a derivative.
sfpotter•19h ago
For what it's worth, what you originally said was: "This is total nonsense." The points you're making are valid, but it isn't "total nonsense". Something not being exactly factually correct doesn't mean that it's "total nonsense". Different publications adhere to different levels of rigor. Just because it doesn't meet your own personal standard doesn't make it nonsense for the target audience of the article.
dataflow•22h ago
> This is total nonsense. The point of Lipschitz continuity is that it is more than continuity and less than differentiability. If you assert that it is differentiable the concept loses all meaning.

How is Lipschitz less than differentiability? e^x is differentiable everywhere but not Lipschitz continuous everywhere, right?

constantcrying•21h ago
>How is Lipschitz less than differentiability? e^x is differentiable everywhere but not Lipschitz continuous everywhere, right?

You should compare the local properties.

dataflow•21h ago
Nobody said local anywhere though, right? You just said the statement was nonsense where it clearly makes sense at least globally.

But now that we've moved the goalposts to local: what about f(x) = x^(3/2) sin(1/x), f(0) = 0? It's differentiable yet not locally Lipschitz-continuous on [0, 1]. (Taken straight from Wikipedia)

constantcrying•7h ago
>Nobody said local anywhere though, right?

I was speaking in very general terms about the subject; I was not making mathematically correct statements. You are completely correct about the counterexample.

You actually need some stronger assumptions, namely that f is C^1, meaning it has a continuous derivative. In that case f' is continuous on the compact set K and therefore bounded, giving Lipschitz continuity. Every C^1 function on K is Lipschitz, yet not every Lipschitz function is C^1 or even differentiable.

To be more specific and to give the reason Lipschitz functions are important, look at the following statement from Wikipedia. "A Lipschitz function g : R → R is absolutely continuous and therefore is differentiable almost everywhere, that is, differentiable at every point outside a set of Lebesgue measure zero."

Given this, I hope that you can understand what I said about Lipschitz functions being a weaker version of differentiability.

qbit42•22h ago
If a function is Lipschitz, then it is differentiable almost everywhere by Rademacher's theorem. Moreover, if you want to prove things about Lipschitz functions, you can often (though not always) prove things about continuously differentiable functions with bounded gradients, and then lift to all Lipschitz functions via an appropriate compactness argument.
sfpotter•19h ago
This is a great point... and Rademacher's theorem is one of my favorites. But this is in the context of a numerical algorithm which proposes to replace interval arithmetic with a direct evaluation of a gradient. For an SDF, the gradient fails to exist in many places; and since we're working with it on a computer and not doing math, this algorithm will eventually evaluate the gradient at a point where it is undefined (finite number of points, p > 0). On the other hand, it's straightforward to come up with a useful interval inclusion for the gradient which is defined even in these places (it should contain the subgradient of the function at each point). So, I am personally not convinced of the value of the proposed approach.
qbit42•17h ago
Yeah in context I somewhat agree, though the utility for graphics applications probably comes down to some more empirical aspects that I won't conjecture about. I imagine there is some stuff you could do in this setting by incorporating autograd derivatives from many slightly perturbed points in a neighborhood of the input point (which together act as a coarse approximation of the subdifferential set).
constantcrying•7h ago
Exactly! That is why it is totally superfluous to define Lipschitz continuity in terms of differentiable functions.
yorwba•1d ago
The suggested normalization procedure, even with the ad-hoc fix for gradient discontinuities, doesn't actually ensure that the resulting function is 1-Lipschitz unless the gradient of the gradient magnitude vanishes. The signed-distance functions considered in the article seem to have piecewise constant gradient magnitudes (so are L-Lipschitz, just with L > 1) except for inside the "r", but for less well-behaved functions, higher order derivatives might start to matter.
kragen•1d ago
This is a good point!
meindnoch•23h ago
Sure, but signed distance fields by definition have a constant gradient magnitude, don't they? They measure the distance from the surface, which grows linearly in the normal direction.

But it is true that general implicit surfaces don't necessarily have constant-magnitude gradients. E.g. F(x,y) = x^2 + y^2 - 1 vs. F(x,y) = sqrt(x^2 + y^2) - 1
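
A quick numerical check of those two examples (the gradient magnitudes are worked out by hand; the snippet just tabulates them):

  import math

  # Gradient magnitudes of the two implicit circles above at a point (x, y):
  #   F1 = x^2 + y^2 - 1        -> |grad F1| = 2 * sqrt(x^2 + y^2)  (not constant)
  #   F2 = sqrt(x^2 + y^2) - 1  -> |grad F2| = 1                    (a true SDF)
  for x, y in [(0.5, 0.0), (2.0, 0.0), (3.0, 4.0)]:
      r = math.hypot(x, y)
      print(f"r = {r:.1f}   |grad F1| = {2 * r:.1f}   |grad F2| = 1.0")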

diabllicseagull•1d ago
I worked on a patent some years ago about SDF CSG tree pruning and constant-radius filleted blends. Sadly, patents don't get the same visibility journals enjoy.

https://patentimages.storage.googleapis.com/7a/73/2d/8d2eeca...

marcosdumay•1d ago
If you are after visibility, you can always do both. But any patent will push people away from your work.
jacobolus•23h ago
In theory, the purpose of a patent is to encourage sharing inventions so people can build on each other's work. In practice, the modern purpose of a patent is to claim ownership over ideas and block further innovation so the author can extract rents via the court system; they are typically written to be as vague and inscrutable as possible to help cover a wider range of possible alternative inventions someone else might come up with, with no incentive for clarity. There's generally little reason for someone who isn't a lawyer to read a patent that hasn't expired yet – still a decade away in this case.

A paper is usually better: the goal is very explicitly sharing knowledge, and there are peer reviewers and editors whose job is to make sure the writing is clear.

mitthrowaway2•23h ago
One reason for reading patents that haven't expired yet is if you're trying to evaluate an offer from a startup which has patents on their core technology. It can be worth understanding how strong or weak their technology position is.
guyomes•23h ago
A generalisation of this idea has been known as the Taylor model since 1998 [1]. It might even have been known in 1984 as neighborhood arithmetic [2]. The generalisation works by taking a Taylor expansion of the function up to order n and then bounding the remainder using bounds on the partial derivatives of order n+1 [3].

[1]: https://www.bmtdynamics.org/cgi-bin/display.pl?name=rdaic

[2]: https://books.google.fr/books?id=2zDUCQAAQBAJ

[3]: https://en.wikipedia.org/wiki/Taylor%27s_theorem#Taylor's_th...
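
A toy 1-D version of the idea above, assuming we are handed a bound on |f''| over the box. The function and names are made up for illustration and are not from the cited packages:

  import math

  def taylor_model_range(f, df, d2f_bound, a, b):
      """First-order Taylor enclosure of f over [a, b]: expand around the
      midpoint c and bound the Lagrange remainder by d2f_bound * h^2 / 2,
      where h = (b - a) / 2 and d2f_bound >= |f''| on [a, b]."""
      c = 0.5 * (a + b)
      h = 0.5 * (b - a)
      slope = abs(df(c)) * h
      rem = 0.5 * d2f_bound * h * h
      return (f(c) - slope - rem, f(c) + slope + rem)

  # Enclose sin over [0, 0.5]; |sin''| <= 1 everywhere.
  print(taylor_model_range(math.sin, math.cos, 1.0, 0.0, 0.5))
  print(math.sin(0.0), math.sin(0.5))   # true range endpoints, for comparison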

sfpotter•21h ago
Worth pointing out that these ideas were already well known by Moore, the founder of interval arithmetic. Chapter 3 of his monograph "Methods and Applications of Interval Analysis" has the basic ideas worked out.

The folks working on the Taylor model were coming from physics where they had some nasty (very ill-conditioned) nonlinear systems that they needed to solve and developed a package called COSY which implements it.

My understanding is that the Taylor model is effective in practice, but there might have been some confusion around what it was actually capable of. I believe the Taylor model people claimed that their interval inclusions had better than quadratic excess, but this turned out not to be the case. Other people were able to push past the quadratic limit of the well known interval inclusions using other techniques. There are some recent papers by Hormann and Yap along these lines, although I think the first interval inclusion that is better than quadratic dates back further...

yorwba•11h ago
I assume "recent papers by Hormann and Yap" is a reference to e.g. Range functions of any convergence order and their amortized complexity analysis https://www.inf.usi.ch/hormann//papers/Hormann.2023.RFO.pdf ? Cool stuff.
sfpotter•58m ago
Yeah that's it. There might be a couple others? That paper should also have references back to earlier papers on higher order inclusion functions.
ajkjk•22h ago
Don't know much about this kind of thing, but it seems like it would be useful, if possible, to store the actual displacement vector (dx, dy) to the surface instead of just its magnitude, the SDF. I can't evaluate the tradeoff of storing an extra value, but it seems like it would make the recursion in the first section a lot easier: when you want to test whether a particular box contains the object, if [v-d, v+d] contains 0, you would know a lot more about where to look: you need only continue the iteration on subdivisions that are in the correct direction.
sfpotter•21h ago
Cool post.

The Lipschitz trick here relies on the assumption that the gradient has magnitude 1 everywhere. Evaluating the SDF in a box and using the Lipschitz property lets you get some quick estimates on the range of the SDF over the box.

This is a close cousin of the monotonicity interval inclusion: e.g., if f'([a, b]) > 0, then f([a, b]) is a subset of [f(a), f(b)]. Rounded interval arithmetic makes this rigorous on a computer; you can also do it in multiple dimensions, with higher derivatives, successively tightening inclusions for lower derivatives, etc.
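
A minimal sketch of the Lipschitz range estimate being described, for a 1-Lipschitz SDF over a box. The radius passed in is the Euclidean half-diagonal of the box, since that is the norm the Lipschitz constant refers to; the names are illustrative, not the article's:

  import math

  def lipschitz_range(f, center, radius, L=1.0):
      """Range bound from |f(p) - f(c)| <= L * |p - c| <= L * radius, valid
      for every point p within Euclidean distance `radius` of `center`."""
      fc = f(*center)
      return (fc - L * radius, fc + L * radius)

  # Unit-circle SDF (1-Lipschitz); a box of half-width 0.25 centered at (2, 0)
  # has Euclidean half-diagonal 0.25 * sqrt(2).
  sdf = lambda x, y: math.hypot(x, y) - 1.0
  lo, hi = lipschitz_range(sdf, (2.0, 0.0), 0.25 * math.sqrt(2))
  print(lo, hi)   # about (0.65, 1.35): 0 is excluded, so the box can be culled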