A rudimentary simulation of the three-body problem

219 points
antognini17 days ago

If anyone is interested in playing around with gravitational dynamics I highly recommend the REBOUND library [1]. It has a simple Python interface but under the hood it has a collection of research grade integrators. It came out around the time I was finishing up my PhD in gravitational dynamics, but if I were still in the field it's what I'd be using.

If you're curious what would happen to the Solar System if you made Jupiter 10 times more massive, you can pip install the library and find out for yourself in about five minutes.


lucgommans17 days ago

It's a hundred times less polished than rebound (that readme looks seriously cool!), but in 2021 I also wanted to toy around with orbital mechanics / gravity and couldn't find a quick and easy simulator so I made this to run in a browser:

Since it's so rough around the edges (especially on mobile; initially I was surprised it works at all), here are the steps for the mentioned example of making Jupiter 10x heavier:

    1. Open the scenarios on the left and click play on the inner solar system to load that up
    2. Click the plus on the outer planets to add them in (if it looks like nothing happened: zoom out. Space is big and this is to scale)
    3. Fold out the "bodies" section and alter the mass for "J"upiter. The change is applied live.
    4. Optionally press Restart to restart with the current settings but back at their initial positions and speeds
Making Jupiter 1000× heavier (and fast-forwarding the time in the Simulator controls by 10×) makes it eject Mars from the solar system within one minute, but interestingly Mercury and Venus seem pretty stable around the sun in that configuration

The help/about page contains links to all the other orbit projects I could find. Seeing Rebound as well as the OP, I should probably add a "libraries" section! Or do you think that should just go with "Software to download" alongside Stellarium and such?

ricksunny17 days ago

Looks awesome. I have a question that might be pedantic but I think you could speak to. Coming from an engineering mindset, I like the use of the term 'integrator' instead of 'solver'. In MATLAB, 'solvers' are used to iterate states of ODE models. But the term 'integrator' is more intuitive to me. Can you speak to the use of one term vs another in the ephemeris community?

constantcrying17 days ago

"Solver" is a general term; it is just an algorithm which solves a certain problem. You have solvers for linear systems, PDEs, optimization problems, graph problems, etc.

An integrator is an algorithm which allows the numerical approximation of the solution to an ODE, given that the ODE is written in a specific form where it is equivalent to calculating the integral of multiple functions.
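Concretely, Newton's second law is a second-order ODE; the standard reduction to the first-order form an integrator consumes treats position and velocity as separate state variables (sketched here, nothing beyond the usual textbook form):

```latex
% Second-order equation of motion -> first-order system for an integrator
\ddot{\mathbf{x}} = \mathbf{a}(\mathbf{x})
\quad\Longrightarrow\quad
\frac{d}{dt}\begin{pmatrix} \mathbf{x} \\ \mathbf{v} \end{pmatrix}
= \begin{pmatrix} \mathbf{v} \\ \mathbf{a}(\mathbf{x}) \end{pmatrix},
\qquad
\mathbf{x}(t+h) = \mathbf{x}(t) + \int_t^{t+h} \mathbf{v}\, dt .
```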

dreamcompiler17 days ago

You can solve differential equations by integrating or by differentiating. Real-world integrators were easier to build back when DEs were solved with analog computers. Although "easier" is an understatement: Differentiators have a nasty habit of trying to blow up to infinity, which means you can't really build a good, general differentiator either with analog or digital electronics.

Integrators are much better behaved pets and they don't shit on the carpet. So everybody uses integrators. Integrators have lots of issues too but those can be sufficiently mitigated for many classes of problems. Differentiators are mostly hopeless, feral beasts.

mnw21cam16 days ago

Are you sure you got these the right way round? It seems logical that something that integrates over time should veer off towards infinity eventually, but something that differentiates should be stable.

dreamcompiler16 days ago

Integrators are summers, so they can increase without bound over time. But if the input function varies around y=0 -- as many do -- they can remain stable.

Now let's talk about infinities that can happen instantly: What's the derivative at the upward edge of a square wave?

nonfamous17 days ago

This account has been posting simulations of interesting 3-body scenarios for quite a while. It used to be on Twitter but moved to Mastodon. You can check out the archives and play the videos, it’s quite neat:

hackernewds16 days ago

How is it random if the 3-body scenarios are chosen to have overlapping ellipses that look beautiful?

itishappy16 days ago

The Mastodon videos have random initial conditions, the GitHub example does not. Are you checking the right link?

pixelesque16 days ago

Rejection sampling? /s

Hugsun17 days ago

I love this bot, thanks for sharing!

737373737317 days ago

There is this cool paper "Crash test for the restricted three-body problem" [0], that plots where the third body eventually ends up when dropped from any location. Looks very fractal-like [1][2]




jameshart17 days ago

‘Strange attractors’:

“An attractor is called strange if it has a fractal structure.”

cl3misch16 days ago

Thank you for reminding me of this beautiful field of mathematics.

The best textbook I've ever read: Nonlinear Dynamics and Chaos by Steven Strogatz.

mock-possum16 days ago

Looks like fluid dynamics / turbulence to me

DrFalkyn16 days ago

E&M and gravity are both inverse-square forces, but E&M has polarity, whereas gravity (at least as we encounter it) does not.

joe_the_user17 days ago

One of the first programs I ever wrote was a simulator for a planet orbiting a star, with a naive difference-equation approximation to Newton's law. I was a bit disappointed to see the planet reliably spiral into the sun.

The main thing is that something like Euler's method (naive iterative approximation) doesn't guarantee conservation of energy. I believe that this is why planetary dynamics are usually handled with Lagrangian equations rather than the naive approximation approach.

Edit: It would be nice to see what the author's system does for two bodies as a sanity check. A three-body system is indeed chaotic but still conserves energy - would this system do that?
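For illustration (not the OP's original program), a minimal sketch of that failure mode in Python, with an invented toy setup (G·M = 1, circular orbit):

```python
# Forward (explicit) Euler for a planet around a fixed star (G*M = 1).
# The scheme does not conserve energy, so the orbit slowly drifts
# (whether it spirals in or out depends on the scheme; either way the
# total energy wanders away from its true, constant value).
import math

def energy(x, y, vx, vy):
    r = math.hypot(x, y)
    return 0.5 * (vx**2 + vy**2) - 1.0 / r

x, y, vx, vy = 1.0, 0.0, 0.0, 1.0    # circular orbit: E = -0.5
e0 = energy(x, y, vx, vy)
dt = 0.01
for _ in range(1000):                # roughly 1.6 orbits
    r3 = math.hypot(x, y) ** 3
    ax, ay = -x / r3, -y / r3
    x, y = x + vx * dt, y + vy * dt  # position stepped with *old* velocity
    vx, vy = vx + ax * dt, vy + ay * dt

drift = abs(energy(x, y, vx, vy) - e0)
print(f"energy drift after 1000 steps: {drift:.4f}")
```

The drift grows with every step; run it longer and the orbit visibly decays or unbinds.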

dekhn17 days ago

It wasn't the first program I wrote, or even a program I wrote, but in middle school a friend wrote a 3-body integrator in BASIC (sun, earth, moon). That single 20 line program shaped my entire world view for a long time (decades), implying to me that we could, if we had powerful enough computers, simulate all sorts of things... even entire universes (which was also an idea that I explored with cellular automata).

It's not a particularly helpful worldview and can often be harmful if you're working with complex systems, or systems that require more than O(n log n) per step, or any number of other real-world problems scientists face.

Many years later I was impressed at how well astronomy packages work (i.e., "it's time T at location L, what is the angle of the sun in the sky?") and stumbled across a paper by Sussman showing some pretty serious work on future prediction of solar system objects.

forgotpwd1617 days ago

>simulate all sorts of things... even entire universes

You also assumed that chaos is a measurement problem, i.e. that you could simulate the entire universe if you knew the initial conditions precisely enough. There were two nice recent papers [1][2] that showed that in order to predict some orbits you'd need accuracy finer than the Planck length, or else the systems are fundamentally unpredictable.

[1]: [2]:

dekhn17 days ago

I'd love to see convincing evidence that we could simulate the universe using only standard physical laws. IIUC we don't have a way to do that or reliably say whether it's possible. It's also not that interesting a problem because it's so impractical.

bufferoverflow17 days ago

> if we had powerful enough computers, simulate all sorts of things

In reality, we're having a hard time precisely simulating even two atoms interacting, with all the quantum effects, diffraction, gravity (however minuscule), etc.

Our universe is surprisingly detailed.

64-bit floats aren't even close to enough to precisely simulate the real world. What's the precision of the mass of an electron? What's the precision of its coordinates or motion vectors? Maybe the Planck length for coordinates, maybe not. What about acceleration? Can it be arbitrarily low? An electron's gravitational field from a billion light years away should theoretically affect us (in time).

xvector16 days ago

The assumption in your comment is that any of this is real to begin with and logic isn't being short-circuited in our brains to make everything "check out" even if it doesn't.

If you simulate a universe with cube blocks from Minecraft, it doesn't matter as long as your users think the simulation is real.

And since you are simulating their consciousness, you can easily short circuit the train of thought that would cause doubt, or that would attempt logic, etc., so they truly believe their Minecraft cube world is incomprehensibly detailed down to the atoms and galaxies in the sky.

They'd happily go on the whiteboard, and prove their theories with math like 2+2=5 and everyone would agree because they literally couldn't disagree - they would feel in their hearts and minds that this is perfectly correct. There's nothing to say that's not happening now.

In fact, this is how I see most advanced civilizations performing simulations. The compute savings would be immense if you could just alter user consciousness as opposed to simulating an actual universe.

dekhn16 days ago

One imagines that post-singularity overlords don't have to worry about IEEE 754. Float is likely not the right representation here, but a double is enough to represent solar-system-scale distances at centimeter precision.

bufferoverflow16 days ago

> at centimeter precisions

So yeah, you're 33 decimal orders of magnitude off from the Planck length. And that's assuming that the Planck length is the smallest possible length.

So you'd need at least 117 extra bits to get your representations precise. And that's just for our solar system.

For the observable universe (~93 billion light years across) you'd need 206 bits of precision.
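That last figure checks out, assuming the usual constants (~9.46×10¹⁵ m per light year, ~1.6×10⁻³⁵ m for the Planck length); a quick sanity check in Python:

```python
# Bits needed to address the observable universe at Planck-length resolution.
import math

LIGHT_YEAR = 9.4607e15           # metres
PLANCK_LENGTH = 1.616e-35        # metres
diameter = 93e9 * LIGHT_YEAR     # ~93 billion light years, in metres

bits = math.ceil(math.log2(diameter / PLANCK_LENGTH))
print(bits)  # 206
```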

FredFS45617 days ago

I think symplectic integrators are typically used, which are derived from Hamiltonian mechanics.

zokier17 days ago

It's true that Euler integration is about as crude as you can get, but you don't need to reach for Lagrangians for improvement; something like Verlet integration can already bring dramatic gains with fairly small changes.

_0ffh17 days ago

Yes, I think it's probably the simplest symplectic method, which would be quite the improvement already.

forgotpwd1617 days ago

You can convert Euler's method to a symplectic integrator by using v_{n+1} when computing x_{n+1}. That said, although such integrators (usually of higher order than Euler's) are widely used in celestial mechanics, one is not restricted to them. For example, Bulirsch-Stoer is also widely used even though it isn't symplectic, because it remains accurate (very low energy error) even over long integrations.
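For illustration, a minimal sketch (my own toy setup, G·M = 1, circular orbit) of that one-line change: the symplectic variant steps x with the already-updated v, and its energy error stays bounded while plain Euler's grows:

```python
# Forward Euler vs. semi-implicit (symplectic) Euler on a circular orbit.
# The only difference: the symplectic variant updates the velocity first
# and then steps the position with the *new* velocity.
import math

def accel(x, y):
    r3 = math.hypot(x, y) ** 3
    return -x / r3, -y / r3

def energy(x, y, vx, vy):
    return 0.5 * (vx**2 + vy**2) - 1.0 / math.hypot(x, y)

def integrate(symplectic, steps=2000, dt=0.01):
    x, y, vx, vy = 1.0, 0.0, 0.0, 1.0
    e0 = energy(x, y, vx, vy)
    for _ in range(steps):
        ax, ay = accel(x, y)
        if symplectic:
            vx, vy = vx + ax * dt, vy + ay * dt  # v_{n+1} first...
            x, y = x + vx * dt, y + vy * dt      # ...then x_{n+1} uses it
        else:
            x, y = x + vx * dt, y + vy * dt      # x_{n+1} uses the old v_n
            vx, vy = vx + ax * dt, vy + ay * dt
    return abs(energy(x, y, vx, vy) - e0)

drift_euler = integrate(symplectic=False)
drift_symp = integrate(symplectic=True)
print(drift_euler, drift_symp)  # symplectic drift is much smaller
```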

PeterisP17 days ago

Would it make sense to explicitly implement conservation of energy?

I.e. do a simple method but calculate the total energy at the beginning, and at each step adjust the speeds (e.g. proportionally) so that the total energy matches the initial value - you'll still always get some difference due to numerical accuracy issues, but that difference won't be growing over time.
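For what it's worth, a toy sketch of that idea (my own invented setup, G·M = 1): step with plain Euler, then rescale the speed so kinetic plus potential energy matches the initial total. As the replies point out, this pins the energy but doesn't make the trajectory itself correct:

```python
# Euler step + velocity rescaling to pin total energy at its initial value.
import math

x, y, vx, vy = 1.0, 0.0, 0.0, 1.0
E0 = 0.5 * (vx**2 + vy**2) - 1.0 / math.hypot(x, y)   # -0.5
dt = 0.01
for _ in range(1000):
    r = math.hypot(x, y)
    ax, ay = -x / r**3, -y / r**3
    x, y = x + vx * dt, y + vy * dt
    vx, vy = vx + ax * dt, vy + ay * dt
    # Rescale speed so KE = E0 - PE (only valid while that target is > 0).
    ke_target = E0 + 1.0 / math.hypot(x, y)
    ke = 0.5 * (vx**2 + vy**2)
    if ke_target > 0 and ke > 0:
        s = math.sqrt(ke_target / ke)
        vx, vy = s * vx, s * vy

E = 0.5 * (vx**2 + vy**2) - 1.0 / math.hypot(x, y)
print(abs(E - E0))  # pinned to roughly machine precision
```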

comicjk17 days ago

The method you describe would be an example of what is called a "thermostat" in molecular dynamics (because the speed of molecules forms what we call temperature). Such adjustments to the speed can definitely paper over issues with your energy conservation, but you still have to be careful: if you rescale the speeds naively you get the "flying ice cube" effect where all internal motions of the system cease and it maintains its original energy simply by zooming away at high speed.

montecarl16 days ago

Thermostats ensure that the average _kinetic energy_ remains constant (on average or instantaneously, depending on how they are implemented). Your parent post wants to enforce the constraint that the total energy remains constant. So it's a bit different from a canonical ensemble (NVT) simulation. This is a microcanonical ensemble (NVE) simulation. This means you don't know whether you should correct the positions (controlling the potential energy) or the velocities (controlling the kinetic energy).

Basically, there will be error in the positions and velocities due to the integrator used and you don't know how to patch it up. You have 1 constraint; the total energy should be constant. There are 2(3N-6) degrees of freedom for the positions and velocities (if more than 2 bodies). The extra constraint doesn't help much!

Edit: Also, the only reason thermostats work is because the assumption is that the system is in equilibrium with a heat bath (i.e. bunch of atoms at constant temperature). So there is an entire distribution of velocities that is statistically valid and as long as the velocities of the atoms in the system reflect that, you will on average model the kinetics of the system properly (e.g. things like reaction rates will be right). In gravitational problems there is no heat bath.

at_compile_time17 days ago

If you want to demonstrate why the three-body problem is chaotic, you can set it up to run a couple hundred very similar simulations in parallel. Just nudge each body by a tiny amount to simulate uncertainty in the initial conditions and watch the resulting configurations diverge as tiny differences become large differences. Rather than points, you get lines of probability that stretch out and wrap around each other. It's quite striking.

Edit: semantics
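A toy version of that experiment (my own sketch, with arbitrary masses, softening, and initial conditions): integrate two copies of a three-body system, one nudged by 1e-6, and watch the trajectories separate:

```python
# Two copies of a (softened) three-body system, one nudged by 1e-6 in a
# single coordinate. The separation between the trajectories grows --
# the divergence described above. All constants here are arbitrary
# illustrative choices.
import math

def step(bodies, dt=0.001, eps=0.05):
    """One semi-implicit Euler step. bodies = [[x, y, vx, vy], ...],
    all masses 1, G = 1, gravity softened by eps to avoid blow-ups."""
    acc = []
    for i, (xi, yi, _, _) in enumerate(bodies):
        ax = ay = 0.0
        for j, (xj, yj, _, _) in enumerate(bodies):
            if i != j:
                dx, dy = xj - xi, yj - yi
                r3 = (dx * dx + dy * dy + eps * eps) ** 1.5
                ax += dx / r3
                ay += dy / r3
        acc.append((ax, ay))
    for body, (ax, ay) in zip(bodies, acc):
        body[2] += ax * dt
        body[3] += ay * dt
        body[0] += body[2] * dt
        body[1] += body[3] * dt

ics = [[-1.0, 0.0, 0.0, 0.2], [1.0, 0.0, 0.0, -0.2], [0.0, 0.5, 0.2, 0.0]]
sim1 = [list(b) for b in ics]
sim2 = [list(b) for b in ics]
sim2[0][0] += 1e-6                 # the tiny nudge

for _ in range(5000):              # integrate both copies to t = 5
    step(sim1)
    step(sim2)

sep = math.sqrt(sum((p[k] - q[k]) ** 2
                    for p, q in zip(sim1, sim2) for k in (0, 1)))
print(sep)                         # typically far larger than the 1e-6 nudge
```

Running a few hundred such nudged copies instead of two gives the "lines of probability" effect described above.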

constantcrying17 days ago

This is irrelevant to the unsolvability.

That the three body problem is unstable and that no analytic solution exists are completely independent statements.

The inverted pendulum is also an unstable ODE, yet it has an analytical solution.

jovial_cavalier17 days ago

This only demonstrates that the system is chaotic, not that there is no closed form solution.

nyrikki17 days ago

This may be a bit pedantic, but the n-body problem is not chaotic; it is harder, having riddled basins.

> A riddled basin implies a kind of unpredictability, since exact initial data are required in order to determine whether the state of a system lies in such a basin, and hence to determine the system’s qualitative behavior as time increases without bound. (Note this is different from “chaos,” where very precise initial data are required to determine finite-time behavior.) What is more, any computation that determines the long-term behavior of a system with riddled basins must use the complete exact initial data, which generally cannot be finitely expressed. Hence such computations are intuitively impossible, even if the data are somehow available.

The above post is a good 'example' of sensitivity to initial conditions, and riddled basins do have a positive Lyapunov exponent, which is often the only criterion in popular mathematics related to chaos. But while a positive Lyapunov exponent is required for a system to be chaotic, it is not sufficient to prove a system is chaotic.

If you look at the topologically transitive requirement, where you work with the non-empty open sets U,V ⊂ X....riddled basins have no open sets...only closed sets.

With riddled basins, no matter how small your ε, it will always contain the boundary set.

If you have 3 exit basins you can run into the Wada property, which is also dependent on initial conditions but may have a zero or even negative Lyapunov exponent and is where 3 or more basins share the same boundary set...which is hard to visualize, non-chaotic, and nondeterministic.

Add in strange non-chaotic attractors, which may be easier or harder than strange chaotic attractors, and the story gets more complicated.

Sensitivity to initial conditions is simply not sufficient to show a system is chaotic in the formal meaning.

But the 3 body problem's issues do directly relate to decidability and thus computability.

jovial_cavalier17 days ago

This is all very interesting stuff, and I thank you for a bunch of new keywords to google, but I’m not sure why you say it’s not chaotic.

As far as I understand, extreme sensitivity to parameters/ICs is all that is required for a system to be chaotic.

nyrikki16 days ago

That was once a popular belief, but we have moved past that historical concept.

Here is a paper that is fairly accessible that may help.

It becomes important when you have a need to make useful models, or to know when you probably won't be able to find a practical approximation.

It is similar to the erroneous explanation of entropy as disorder, which is fundamentally false, yet popular.

It has real implications, like frustrating the efforts to make ANNs that are closer to biological neurons:

Or even model realistic population dynamics.

> It has been shown how simple ecosystem models can generate qualitative unpredictability above and beyond simple chaos or alternate basin structures.

Chaotic, riddled, and Wada can be viewed as deterministic, practically indeterminate, and strongly indeterminate, respectively.

If you want to hold on to the flawed popular understanding of the butterfly effect, that is fine; you just won't be able to solve some problems that are open to approximation. And please don't design any encryption algorithms.

I think realizing it is simply a popular didactic half-truth is helpful.

ncallaway17 days ago

But the chaos is extremely critical, since we can’t ever perfectly measure the initial conditions.

So, even having a closed form solution isn’t helpful when computing real world situations.

constantcrying17 days ago

The statements are independent. It having a closed form solution and it being unstable don't contradict or confirm one another.

Numerical solutions to ODEs very often diverge exponentially from the true result. That the 3-body problem does this makes it typical, not special.

>So, even having a closed form solution isn’t helpful when computing real world situations.

Simply not true. It is helpful or not depending on your problem. Often you are interested in short term behavior, which can be studied by numerical methods or, if existing, analytic solutions.

PaulHoule17 days ago

Closed form solutions in general dynamic systems are possible when systems are integrable, which means there is a conserved quantity for every degree of freedom. The solar system is almost like that: each planet keeps going around with a constant angular momentum, so they go around like a set of clocks that run at different speeds. Over long periods of time angular momentum is transferred between the planets, so you eventually get chaos.

Orbital mechanics is a tough case for perturbation theory because each planet has three degrees of freedom (around, in and out, up and down) and the periods are the same for all of these motions and don't vary with the orbital eccentricity or inclination. Contrast that to the generic case where the periods are all different and vary with the amplitude, so with weak perturbations away from a resonance the system behaves mostly like an integrable system, but if the ratio between two periods is close to rational all hell breaks loose.

The harmonic oscillator has a similar problem because the period doesn't change as a function of the amplitude. Either way, these two pedagogically important systems will lead you completely astray in understanding nonlinear dynamics: if you add, say, an εx^3 term to the force in one of two coupled harmonic oscillators, it is meaningless that ε is small. You have to realize that the N=2 case of this integrable system

is the right place to start your perturbation from, which ends up explaining why the symmetry of the harmonic oscillator breaks the way that it does. Funny though, the harmonic oscillator is not weird at all in quantum mechanics and is just fine to do perturbation theory from.

why_at17 days ago

Thanks for this. This is something that always bugged me when I see explanations of the three-body problem. They'll say something like "changing the initial conditions just a tiny bit can dramatically change the outcome!" as an explanation for why having no closed form solution is significant.

But that never made sense to me, since plenty of things with closed form solutions also do this.

zardo17 days ago

It was really a math problem and not a physical one. There's so much more going on (GR, radiation pressure) that even if there were a solution it wouldn't be able to predict Mercury's orbit.

adastra2217 days ago

FYI there are stable configurations to 3+ body systems. Not all configurations are chaotic.

mistermann16 days ago

> This only demonstrates that the system is chaotic, not that there is no closed form solution.

This seems a bit off, it seems like[1] an implicit assertion ("only(!) demonstrates") that it is not possible for a system that lacks a closed form solution in fact (beyond our ability to discern) to be demonstrated.

To be clear, I'm in no way implying this was your intent (I see it as an interesting "quirk" of our culture)... I'm mainly interested in whether you can see what I'm getting at.

As a thought experiment, stand up two instances: one is our current situation (inability to discern, indeterminate), the other where we have (somehow) proven out (or, come to believe we have, reality being Indirect but experienced as Direct, thus: "is "proven", thus: "is") that a closed form solution is not possible: would the second instance "be(!) a demonstration that the system has no closed form solution"? (Thinking more....I think maybe the choice of the word "demonstrate" may very well make a path to seeking the truth of the matter ~impossible to achieve in these sorts of cases, especially if one takes cultural forces[2] into consideration).

[1] Using "pedantry", which few people understand the technical meaning of, and tend to flip flop on depending on what is being considered (precision & accuracy in science/physics is good, precision & accuracy in philosophy/metaphysics is bad - no explanation or justification needed: a Cultural Fact).

[2] Which make the 3 body problem in the known to be deterministic physical realm seem like child's play.

jovial_cavalier15 days ago

I'm not sure I understand what you're saying.

>As a thought experiment, stand up two instances: one is our current situation ..., the other where we have (somehow) proven out ... that a closed form solution is not possible

As far as I understand, this has in fact been proved. Quite a long time ago, too, by Poincare I believe.

GP has edited his comment to reflect my feedback, but originally said that his experiment "demonstrates that there is no solution." All I was trying to point out is that the two concepts are not necessarily related.

You could imagine some system x' = f(x), where f(x) is some transcendental function. There is no analytic solution to this system, but it's obviously not chaotic.

Could you imagine a system that is chaotic but does have an analytical solution? I'm not sure. Closest I could find to answering this was:

I'm sure he understood this. I only commented to try and minimize the confusion of others.

edit - This article suggests that the logistic map (a system famously used to introduce the concept of chaos) has an analytical solution:
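Indeed, for r = 4 the logistic map has the well-known closed form x_n = sin²(2ⁿ·arcsin√x₀), which is easy to check numerically. It also shows where the chaos lives: the angle doubles each step, so initial-condition error doubles too:

```python
# Logistic map x_{n+1} = 4 x_n (1 - x_n) vs. its closed-form solution
# x_n = sin^2(2^n * asin(sqrt(x_0))) -- an analytic solution to a
# famously chaotic system.
import math

x0 = 0.2
theta = math.asin(math.sqrt(x0))

x = x0
for n in range(1, 11):
    x = 4.0 * x * (1.0 - x)                 # iterate the map
    closed = math.sin(2**n * theta) ** 2    # evaluate the closed form
    assert abs(x - closed) < 1e-9, (n, x, closed)
print("closed form matches for 10 iterations")
```

The identity follows from the double-angle formula: 4 sin²φ cos²φ = sin²(2φ).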

fayalalebrun16 days ago

A few years ago a friend and I made something similar to universe sandbox, though only with the gravitational simulation part:

Surprisingly enough the jar still runs without issue, something which probably would not be the case for Linux binaries, but maybe for Windows.

zamadatix17 days ago

KSP doesn't actually do >2-body simulations, which allows the paths to follow closed-form solutions. This is why you get an abrupt orbit change between bodies instead of a freeform path. There is a mod, "Principia", which adds this functionality, but be warned: it makes the game very different to play!

Georgelemental17 days ago

Principia is extremely impressive; they document all their math here:

One interesting detail is that the source code makes extensive use of non-ASCII identifiers, for mathematical symbols and for the names of mathematicians. One of the two primary contributors is also an active contributor to Unicode

BlueTemplar16 days ago

Thanks, I was wondering how feasible this was / how one would do that in C++ !

wkat424216 days ago

KSP even while simplified is so amazing.

I've never gotten very far but the one thing it did manage to impress extremely thoroughly on me is "space is hard". And it's like 5x easier in KSP than on earth lol.

Also it showed me that that ever recurring thought of "why don't they just..." is usually pretty misguided.

I really respect how they managed to make this fun and so incredibly educational at the same time.

xyst17 days ago

Netflix really going hard on pushing their IP. It’s like guerilla marketing on steroids.

I jest. Tbh, I didn’t know this was an actual problem. Thanks for sharing.

QuantumG17 days ago

I've used the NAIF SPICE toolkit, which includes the Yarkovsky effect and such, important for small bodies.

nico17 days ago

This is great! Thank you

In college, a long time ago, I wrote something like this for n bodies, but in C++ and OpenGL.

More recently I’ve built something similar in python

For anyone interested in this, I recommend this Wired article that goes from the 2 body problem to n, with simulations and code that run on the browser:

rsynnott17 days ago

Do you want ghost numbers counting down on your retinas? Because this is how you get ghost numbers counting down on your retinas.

lagadu17 days ago

It's ok, I'll just make sure never to measure the CMB for variations.

mafuku16 days ago

I've always read about how astronomers, especially after Newton, could predict the movement of the planets with incredible precision, yet n-body problems are so infamously complex and hard to solve, even now. Why is that? Is it simply that the mass of everything except for the sun is just so negligible?

aoanla16 days ago

N-body problems can't be analytically solved. However, you can still integrate into the future (with some acceptable error); you just need to step through all the intermediate states along the way.

In the case of the solar system, yes, it helps that the Sun is much more massive than everything else (and then Jupiter is 4 times more massive than Saturn, the next biggest) - you can go a long way to a "reasonable" solution by starting with the 2-body solution if only the Sun affected each planet, and then adding in the perturbation caused by Jupiter and Saturn. (In fact, that's how we predicted the existence of Neptune, by noticing that there were extra perturbations on Uranus beyond those, and hence another massive planet must exist, far enough away from the sun to only significantly affect Uranus).

octachron16 days ago

Predictions of the solar system's state are accurate only over "short" periods. The solar system is chaotic, and predicting its state a few million years out is no more possible than predicting the weather for next year. This does not preclude making very accurate predictions over short periods.

Note that n-body problems are not particularly complex or hard to solve compared to other chaotic systems. In many ways, the existence or nonexistence of closed-form solutions is mostly a distraction: it merely reflects our choice of primitive functions, and there is no set of primitive functions that is closed under addition, multiplication, composition, inversion, and integration. Typically, even the simple integral ∫ eˣ/x dx cannot be decomposed into more elementary functions.

But that doesn't matter in practice, because we are already using numerical approximation to compute primitive functions that are not implemented in hardware. Using numerical solvers to compute solutions to ODEs is not so different. A good illustration of that point is that there is an analytic solution to the 3-body problem (in the form of an infinite series in t^{1/3}). But this solution is useless for computing orbits because it has bad convergence properties. In other words, it is better to use a numerical solver than to stick to the analytical solution. A similar phenomenon exists for polynomial equations of degree 3 and 4: the exact formulas are numerically unstable, and it's better to use a numerical solver when one wants a numerical solution.

DrFalkyn16 days ago

IAMAP, but n-body problems result in coupled non-linear ordinary differential equations, for which solutions are known only in a few special cases, and even then due to simplifications, e.g., treating planets as point masses and ignoring tidal forces, or ignoring the pull of the planets on the Sun (or on each other).

One such case where a solution is known is the Lagrange points of the Earth-Moon-Sun system (and similarly for other systems). But in reality, they exist only as an approximation. They aren't truly stable.

My understanding is that the way to calculate spacecraft, asteroid, etc. trajectories is just through a discrete simulation.

Like if you don't know the antiderivative of a given function, you can still calculate the integral numerically since you know the function's values.

rcxdude16 days ago

To add to the points from others: 3+-body systems are generally chaotic, so you cannot predict arbitrarily far into the future. The solar system is reasonably well-behaved in that respect, so the timescales are long, but we don't know where the planets in the inner solar system will be in their orbits in ~5-10 million years (as in, that's the timescale where the error bars for the position in the orbit span the whole orbit). Of course, if you care about more precise predictions then the timescales are shorter: eclipse predictions more than 1000 years in the future are likely to be quite inaccurate.

constantcrying16 days ago

For almost any differential equation there is no analytical solution to its initial value problem. That the n-body problem behaves that way is unsurprising and poses no inherent challenge to making predictions.

Computers can easily solve initial value problems for most ordinary differential equations. They integrate them, calculating an approximate solution after every small but finite step.

Getting an approximate solution to the 3-body problem can be achieved in around 20 lines of Python, without having to use any libraries. It is a remarkably simple and effective technique.
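In that spirit, here is a sketch of roughly that size: a leapfrog integrator started on the figure-eight choreography (the standard Chenciner-Montgomery initial conditions, G = 1, equal unit masses), checking energy conservation over about one period:

```python
# ~20-line three-body leapfrog integrator (G = 1, equal unit masses),
# started on the figure-eight periodic orbit.
import math

pos = [[0.97000436, -0.24308753], [-0.97000436, 0.24308753], [0.0, 0.0]]
vel = [[0.46620368, 0.43236573], [0.46620368, 0.43236573],
       [-0.93240737, -0.86473146]]

def accels(pos):
    a = [[0.0, 0.0] for _ in pos]
    for i in range(3):
        for j in range(3):
            if i != j:
                dx = pos[j][0] - pos[i][0]
                dy = pos[j][1] - pos[i][1]
                r3 = (dx * dx + dy * dy) ** 1.5
                a[i][0] += dx / r3
                a[i][1] += dy / r3
    return a

def energy(pos, vel):
    ke = sum(0.5 * (vx * vx + vy * vy) for vx, vy in vel)
    pe = sum(-1.0 / math.dist(pos[i], pos[j])
             for i in range(3) for j in range(i + 1, 3))
    return ke + pe

dt, e0 = 0.001, energy(pos, vel)
for _ in range(6326):                  # about one period (T ~ 6.3259)
    a = accels(pos)
    for i in range(3):                 # kick-drift...
        vel[i][0] += 0.5 * dt * a[i][0]
        vel[i][1] += 0.5 * dt * a[i][1]
        pos[i][0] += dt * vel[i][0]
        pos[i][1] += dt * vel[i][1]
    a = accels(pos)
    for i in range(3):                 # ...kick
        vel[i][0] += 0.5 * dt * a[i][0]
        vel[i][1] += 0.5 * dt * a[i][1]

drift = abs(energy(pos, vel) - e0)
print(drift)                           # tiny energy drift over one period
```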

piuantiderp16 days ago

You can solve it numerically using finite differences, basically linear approximations.

dreamcompiler17 days ago

If you want to take the next step up in accuracy and cleverness, investigate the work of Mr. Runge and Mr. Kutta.
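For the curious, a sketch of the classic fourth-order Runge-Kutta method on the same kind of toy two-body problem (G·M = 1); the per-step error drops from O(dt²) for Euler to O(dt⁵):

```python
# Classic RK4 on a circular two-body orbit (G*M = 1). Not symplectic, but
# so accurate per step that energy drift is negligible over short runs.
import math

def deriv(state):
    x, y, vx, vy = state
    r3 = math.hypot(x, y) ** 3
    return (vx, vy, -x / r3, -y / r3)

def rk4_step(state, dt):
    k1 = deriv(state)
    k2 = deriv(tuple(s + 0.5 * dt * k for s, k in zip(state, k1)))
    k3 = deriv(tuple(s + 0.5 * dt * k for s, k in zip(state, k2)))
    k4 = deriv(tuple(s + dt * k for s, k in zip(state, k3)))
    return tuple(s + dt / 6.0 * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

def energy(state):
    x, y, vx, vy = state
    return 0.5 * (vx**2 + vy**2) - 1.0 / math.hypot(x, y)

state = (1.0, 0.0, 0.0, 1.0)
e0 = energy(state)
for _ in range(1000):              # dt = 0.01, roughly 1.6 orbits
    state = rk4_step(state, 0.01)
drift = abs(energy(state) - e0)
print(drift)
```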

zelphirkalt16 days ago

Is there some way of determining whether the orbits, given some starting parameters, will _at some point_ become stable? I probably mean periodic. And I probably mean mostly periodic, with only slight deviation from the orbits. And I don't necessarily mean that we can calculate/predict what they will look like in a stable form.

This reminds me of Langton's ant, which has very simple rules, but at first still seems very chaotic. Then after some number of iterations it just shoots away in an endlessly repeating regular pattern. "Order came from chaos." So it makes me think: maybe there is no way to tell whether the orbits "stabilize" at some point, but maybe they will, and we simply don't know when or how to tell?
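Langton's ant is easy to reproduce: with the standard rules (turn right on white, left on black, flip the cell, step forward), the chaotic phase famously lasts roughly 10,000 steps before the "highway" begins. A minimal sketch:

```python
# Langton's ant: chaotic for roughly 10,000 steps, then an endless
# diagonal "highway". We run it and watch the ant escape its chaotic blob.
black = set()                       # cells currently black
x = y = 0
dx, dy = 0, 1                       # facing "up"

def run(steps):
    global x, y, dx, dy
    for _ in range(steps):
        if (x, y) in black:         # black: turn left, flip to white
            dx, dy = -dy, dx
            black.remove((x, y))
        else:                       # white: turn right, flip to black
            dx, dy = dy, -dx
            black.add((x, y))
        x, y = x + dx, y + dy

run(10_000)
chaotic_dist = abs(x) + abs(y)      # still near the origin
run(10_000)
highway_dist = abs(x) + abs(y)      # far away along the highway
print(chaotic_dist, highway_dist)
```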

badrunaway16 days ago

What does the three-body problem tell us about our universe? It surprises me that the universe by its nature ended up with such smart safeguards to disallow predictability via computation.

zakhar17 days ago

Hah, I did something similar at!/-2/-1/three-body after reading the book last year.

Source at

airstrike17 days ago

Loved your website. Thanks for sharing.

alex_duf16 days ago

Completely selfish plug, but it's not very hard to implement, so here's my implementation using p5js:

You can play around with the code. Clicking generates a new solar system. There are a few constants you can adjust at the top of the file.

JKCalhoun17 days ago

And since A. K. Dewdney is fresh on my mind: he did a Computer Recreations article generally about this (simulating orbital mechanics), and the clever bit that I remember is that you dial the time slice way down as objects get close, since you care little when the objects are far apart.

Not a "solution" of course, but certainly an optimization if you're just generally doing gravitational simulations.
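A minimal sketch of that trick: shrink the step in proportion to the closest pair separation. The function names and the `scale` constant are my own, not Dewdney's:

```python
import math

def min_separation(positions):
    # Smallest pairwise distance among the point masses (2D points).
    return min(
        math.dist(positions[i], positions[j])
        for i in range(len(positions))
        for j in range(i + 1, len(positions))
    )

def adaptive_dt(positions, base_dt=0.01, scale=1.0):
    # Use the full step when everything is far apart, and shrink it
    # linearly with the closest separation during an encounter.
    # 'scale' is an assumed tuning constant, not from the article.
    return base_dt * min(1.0, min_separation(positions) / scale)
```

For example, `adaptive_dt([[0, 0], [10, 0]])` returns the full 0.01, while two bodies only 0.1 apart get a step ten times smaller.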

pizza16 days ago

If you want a good read that (to summarize it quite tersely) takes the idea of 3- or n-body simulations and goes very far indeed with it (i.e. why there is something rather than nothing), I highly recommend checking out Julian Barbour's The Janus Point.

yzydserd17 days ago

Also see the source code for the popular ThreeBodyBot [0] as seen on mastodon etc [1]

It contains a numerics tutorial [2] that I found very useful for my use case.



[2] (ipynb)

Gys17 days ago

In this simulation the bodies also collide with each other. Luckily that never happened in the book.

CWuestefeld17 days ago

The book is mis-named. Their system was not 3 bodies, but 4 (3 stars, plus a planet). And the system is so chaotic that even that little planet will make a huge difference over time. And even beyond that, the bodies themselves were transformed, having the ability to tear atmosphere away from each other.

marginalia_nu16 days ago

Chaotic N-body behavior appears when the bodies have similar mass. In that scenario, every solution is either chaotic or unstable.

The solar system is strictly speaking a 20+ body system. That said, the behavior of the solar system is fairly predictable because the sun has almost all of the mass, and Jupiter has almost all of what remains; everything else is a small correction term. We can, to a good approximation, calculate the other satellites' orbits around either the sun or the center of mass of the sun-Jupiter system.
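To put a number on that, the Sun-Jupiter center of mass is a two-line calculation (using approximate textbook values):

```python
# Sun-Jupiter barycenter distance from the Sun's center (1D sketch).
# Approximate values: masses in solar masses, separation in AU.
m_sun, m_jup = 1.0, 0.000954   # Jupiter is roughly 1/1048 of a solar mass
a_jup = 5.2                    # mean Sun-Jupiter distance in AU

r_bary = a_jup * m_jup / (m_sun + m_jup)
print(r_bary)  # ~0.005 AU, just outside the Sun's radius (~0.00465 AU)
```

That the barycenter sits barely outside the Sun's surface is a nice illustration of why treating everything else as a correction term works so well.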

SamBam17 days ago

I was wondering about this too, but I think they're not colliding. I think they're pulling towards each other and getting tightly pulled around each other so they end up slingshotting and flying back the way they came, in a way that looks like a bounce.

You can see this in the very beginning of the simulation, with the blue and green dot.

Can anyone say if this is actually accurate? It seems like an unintuitive motion to me, but I'm often surprised by how these things work.

achristmascarl17 days ago

i believe you're correct; other (higher resolution) visualizations of periodic orbits show that "wrapping" behavior more clearly.

example from wikipedia:

woooooo17 days ago

Luckily??! Could have saved everyone a lot of trouble and kept them in the proper number of dimensions.

aaronbrethorst16 days ago

Arguably this was all Mao Zedong's fault.

maelito17 days ago

Should a collision lead to a simulation of a 4-body problem?

achristmascarl17 days ago

that might just be because of my low resolution gif ;)

micheljansen16 days ago

I've been re-reading the trilogy lately and it also led me to search for some three-body simulations. I noticed that a lot of them simulate only a 2D space. Any good (animated) 3D simulations? The best I've been able to find so far are and

yieldcrv16 days ago

isn’t that story a 4-body problem?

micheljansen16 days ago

Not a physicist, but apparently the planet does not really have enough mass to matter:

huksley16 days ago

I was wondering why it's called a three-body problem (at least as it's presented in Liu Cixin's novel). There are actually at least 4 bodies (3 suns and a planet), right? I suppose the planet will affect the movement of the 3 suns.

forgotpwd1616 days ago

Probably a misnomer. And the issue isn't really the planet affecting the movement of the 3 suns (its mass can be considered negligible compared to them) but how the suns affect the movement of the planet. Essentially they're looking for a solution to the restricted 4-body problem rather than the 3-body problem.

kqr16 days ago

No, it's just three. The reason it's a thing – as far as I understand – is that solving analytically for one body is trivial: it's stationary. Solving analytically for two bodies is possible but takes some calculus. Three bodies have chaotic behaviour and need to be simulated.

belst16 days ago

in the novels there are 3 suns, so it is 4 bodies.

The simulation doesn't get easier with 4 though, so 3-body problem is still a good name. Also, the planet's mass is "almost" negligible compared to that of the suns, so I assume simulating (+ occasionally correcting) 3 bodies is already a good approximation.

wzdd16 days ago

It’s not a good approximation because (spoilers) the challenge is to identify the location of the planet relative to the suns, not simply to locate the suns themselves. I too wondered at the title.

PaulHoule17 days ago

I noticed that the paper they link to was the first in a long time to find new periodic orbits in the three-body problem, which confirms what I've long believed: nonlinear mechanics is badly underresearched.

sameoldtune17 days ago

There’s a saying in mathematics circles which I’ll butcher here: “everything is either a linear system, reducible to linear systems, or unapproachable.”

Think about how bad we are at analytically solving “simple looking” diff-eqs and the above statement starts to sound too true.

PaulHoule17 days ago

Exactly. Finding a few hundred periodic orbits is a lot of hard work but you don’t have the glory of having “solved” something. Because of that kind of thinking there are many unanswered questions which are ignored because they don’t seem to be part of some masterstroke.

MrCheeze17 days ago

If, like me, you are suddenly curious what would happen if you added a small fourth body:

danAtElodin17 days ago

Nicely done! We were playing with this concept as well, in case it's useful to compare notes together:

mhkeller17 days ago

Adding a svelte three-body animation by rich harris

maxglute17 days ago

Would be neat if there were planet-level visualization in Space Engine like in the show, where you see the suns whizzing around and the environment freeze/burn.

adamredwoods17 days ago

This isn't planet level, but shows surface:

jxy17 days ago

Probably try to implement a much better integrator. Some energy-preserving higher-order integrator should serve a lot better.
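A kick-drift-kick leapfrog integrator is the usual first step in that direction: second order and symplectic, so the energy error stays bounded instead of drifting the way it does with explicit Euler. A sketch for a test particle in a fixed central potential (my own toy setup, GM = 1):

```python
import math

def acc(p):
    # Acceleration toward the origin in a central potential with GM = 1:
    # a = -p / |p|^3
    r3 = (p[0] ** 2 + p[1] ** 2) ** 1.5
    return (-p[0] / r3, -p[1] / r3)

def leapfrog(p, v, dt, steps):
    # Kick-drift-kick leapfrog: half velocity kick, full position drift,
    # half velocity kick with the new acceleration.
    a = acc(p)
    for _ in range(steps):
        v = (v[0] + 0.5 * dt * a[0], v[1] + 0.5 * dt * a[1])  # half kick
        p = (p[0] + dt * v[0], p[1] + dt * v[1])              # drift
        a = acc(p)
        v = (v[0] + 0.5 * dt * a[0], v[1] + 0.5 * dt * a[1])  # half kick
    return p, v

def energy(p, v):
    # Specific energy: kinetic plus potential (GM = 1).
    return 0.5 * (v[0] ** 2 + v[1] ** 2) - 1.0 / math.hypot(p[0], p[1])

p0, v0 = (1.0, 0.0), (0.0, 1.0)  # circular orbit, E = -0.5
p, v = leapfrog(p0, v0, 0.01, 10_000)
print(abs(energy(p, v) - energy(p0, v0)))  # stays small even after many orbits
```

The same kick-drift-kick structure carries over to the full N-body case with mutual accelerations.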

melondonkey17 days ago

Looks like Pokemon Jirachi

asdfman12317 days ago

This is nothing like what I remember from the book!

lfmunoz417 days ago


petsfed17 days ago

Maybe it's because I made an honest effort at getting a PhD in physics, but I absolutely do not understand this perspective.

Like yes, we have a really hard time talking about just about anything as a finite object with physical extent, but jokes about frictionless spherical cows moving in simple harmonic motion started in secondary school. The gaps and shortcomings should not come as a surprise. But most of us also hold devices in our pockets that leverage actual quantum phenomena to function at all (diodes of any stripe only work because of quantum transitions). So while it's true that there are a variety of unsolved and potentially unsolvable problems in physics, it's a gross misunderstanding to say that it can barely answer simple questions.

I think about the Born-Oppenheimer approximation a lot, as it's so obviously a hack to even do the math at all, but it undergirds basically all of solid state physics.

acover17 days ago

What in this post isn't honest?

lfmunoz417 days ago

"not honest" might not be the right phrasing. What I was trying to say is that when learning this stuff I felt like they hid a lot of information from me, which later surprised me. But they hid it because they have no answers for it.

One simple example is what happens when you don't consider these as points but instead as spheres. Also, what happens when the spheres come close? The math starts breaking down; you start seeing infinities. I.e., in reality spheres come close and gravity doesn't go to infinity.

mr_mitm17 days ago

You are complaining that you study the simple or simplified cases first, before you study near-unsolvable systems?

Besides, very often the simplified case gets you surprisingly far because the difference between idealized situations and reality is often negligible or at least easily describable - see perturbation theory. The simplified cases are well worth studying.

sfink17 days ago