{MPML} Monte Carlo uncertainty estimation
David Tholen
I'm pondering adding a Monte Carlo routine to my orbit determination... <<<< Depends on what you're trying to accomplish. This approach won't necessarily help you out with double solutions. You might just wind up with a fuzzified set of solutions near one of the two solutions, and never find the other minimum in chi-squared space. This would especially be the case if your starting solution was the same in each case.


Bill J. Gray
Jim, David, thanks for your comments...
Jim, I do compute a covariance matrix (in the state vector) as a byproduct of the "usual" orbit determination process. Converting that to a positional uncertainty would be nice and reasonably fast, but it would take an ugly chunk of mathematics to do it. I'm holding off on that task. Besides, even if I did this, life is full of nonlinear behavior (it was Rob McNaught's comments on this list about the recovery of 1978 CA that got me thinking about tackling this sort of problem.) We run into problems all the time where an object has the gall to remain missing for a long time, or only gets observed for a very short time, or insists on passing by a perturber that throws a kink into the solution. I'll take a more thorough look at the Milani papers; I ought to be able to get something useful out of them, though following the math is going to be a problem in some places.

David, I'd planned to handle double/multiple solutions with double or multiple runs. In my outlined algorithm, I skipped step (0): determine an initial, default orbit. Subsequent orbits would be determined by "tweaking" this orbit, _not_ by going back to scratch. So you'd solve with orbit #1, get a particular uncertainty region, then with #2, and get a different uncertainty region. Alternatively/in addition, I may make some use of your scheme for handling short orbital arcs (including those with a mere two observations, or, more commonly, those where a full six-parameter orbit fit won't work.)

Bill
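The procedure Bill outlines (perturb the observations, re-fit, repeat) can be sketched roughly as below. This is a toy illustration only, not anyone's actual code: `fit_orbit` is a hypothetical stand-in for a real least-squares orbit solver, and the "observations" are just numbers:

```python
import random
import statistics

def fit_orbit(observations):
    # Stand-in for a full least-squares orbit fit; averaging the
    # observations is enough to show the shape of the Monte Carlo loop.
    return sum(observations) / len(observations)

def observational_monte_carlo(observations, sigma, n_trials=1000, seed=42):
    """Re-fit the 'orbit' many times with Gaussian noise added to the data."""
    rng = random.Random(seed)
    solutions = []
    for _ in range(n_trials):
        noisy = [obs + rng.gauss(0.0, sigma) for obs in observations]
        solutions.append(fit_orbit(noisy))
    return solutions

obs = [10.01, 9.98, 10.02, 10.00]   # toy "observations"
sols = observational_monte_carlo(obs, sigma=0.02)
spread = statistics.stdev(sols)     # scatter of the fitted solutions
```

The spread of the `sols` cloud then stands in for the uncertainty region; in a real implementation each solution would be a full six-parameter orbit that gets propagated forward.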


E. L. G. Bowell
Bill:
You are essentially describing a technique that has already been published under the name of statistical ranging. The primary reference is Virtanen et al. (Icarus 154, 412, 2001), and there is additional work in Muinonen et al. (Celest. Mech. Dyn. Astron. 81, 93, 2001). There is an application to TNOs by Virtanen et al. (Icarus, in press), and a URL on same at http://asteroid.lowell.edu/cgi-bin/virtanen/tnoeph. There will also be a description of statistical ranging and other recent orbit methods in Bowell et al. (in Asteroids III, U. Arizona Press, 2003). Cheers...Ted


Brian D. Warner
Ted,
There will also be a description of statistical ranging and other recent orbit methods in Bowell et al. (in Asteroids III, U. Arizona Press, 2003). <<<< A book that is very much MIA! <g> Amazon took my order in October 2002 saying it would deliver in mid-December 2002. Still waiting. Do you (or anyone on the list who is an author) have a more up-to-date delivery time? Clear Skies, Brian Warner, Palmer Divide Observatory, Colorado Springs, CO


Alain Maury <amaury@...>
Why don't you download it (less than 100 MB), or just the section you want, while it is available...
http://www.lpi.usra.edu/books/AsteroidsIII/download.html

Alain


dixon_lascruce
FEB 03


Brian D. Warner
Alain,
Why don't you download it (less than 100 MB) or just the section you want while it is available... <<<< As I understood it, that site was really meant for authors only, and not for people to grab a free version a la stealing songs via Napster <g>. Out of principle, I don't mind paying, despite the $110 price. Besides, I already have hardcover versions of I and II, so getting the real thing will complement the set. Clear Skies, Brian Warner


Brian D. Warner
David,
FEB 03 <<<< It now being three days past that deadline: WHERE'S MY BOOK?? <g>

Clear Skies, Brian Warner
Palmer Divide Observatory (IAU 716)
17995 Bakers Farm Rd.
Colorado Springs, CO 80908
http://www.MinorPlanetObserver.com
Collaborative Asteroid Lightcurve Link (CALL)
http://www.MinorPlanetObserver.com/astlc/default.htm


Alain Maury <amaury@...>
brianw_mpo wrote:
Alain, I will buy the book when it becomes available. <<<< I have in the past gotten things from Napster and a few others, and have generally bought the CDs of the interesting ones and deleted the others, when I could (for example, it is hard to get French songs in Chile, and I could hear a CD and decide to buy it on my next trip). I didn't feel like I was stealing, but trying. I just meant you could download Ted's article while waiting for Amazon to send you a copy of the book. Alain


Steve Chesley
Bill,
There are roughly five means of mapping an orbital solution with uncertainty to a given time, each with a fairly well-defined realm of utility.

1) Linear covariance mapping. Take the covariance that you get (or should have gotten!) from the least squares orbit solution and map this to the time and reference frame of your choice via the state transition matrix, which is usually obtained by integrating the variational equations but can also be computed with finite differences. This is not for the faint of heart when it comes to impenetrable jargon, but is really not as complicated as it might sound. Linear methods are appropriate when the uncertainty is "small." Typically, when uncertainties become on the order of a degree on the sky, or become a substantial fraction of an AU in space, this method falls apart. This failure can usually be traced to a weak orbit, a long propagation, a very deep close approach, or some combination of these. Multi-opposition and radar-astrometry orbits are almost always amenable to this approach.

2) Orbital Monte Carlo. When the orbit itself is fairly good, meaning that the uncertainty region is small enough that it really looks like an ellipsoid (not a banananoid) in orbital element space, but the propagation is substantially nonlinear, then Monte Carlo (MC) sampling in element space is the way to go. This amounts to adding noise that is consistent with the covariance matrix to the nominal orbital elements. Do this many times and you get an ellipsoidal cloud of points in space around the time of the observations. Andrea Milani likes to call these "Virtual Asteroids" because the real asteroid could be represented by any of the Monte Carlo samples. Now if you propagate all of the MC points to the time of interest, the cloud will eventually deform to look like a banana after many revolutions, or like a corkscrew if there is a close planetary encounter. Any nonlinearities stemming from the propagation (close encounters, Keplerian shear, etc.) will be handled properly by this method. This method is very simple to implement, but requires a good deal of computer horsepower relative to the linear method. For NEAs, orbital MC sampling is usually appropriate when the observed arc is a few weeks or more.

3) Observational Monte Carlo. When the observed arc is very short, the uncertainty in the original orbit determination is so large that you cannot assume that the ellipsoid represented by the covariance adequately represents the true uncertainty. Monte Carlo sampling can still be done, but this time noise must be added to each of the observations and a new orbit computed based upon the revised observations. This new orbit becomes a Virtual Asteroid as above. The drawback is that you have to solve the least squares problem for every single MC orbit with this approach, which can be time consuming, and so the MC sampling process is far slower. This can be a big deal if you are running many millions of orbits, but in practice the time spent propagating each sample is long relative to the MC sampling time, and so the propagation time is what limits your ability to take a lot of samples. This method is what Bill describes and is, I understand, already a part of John Rogers' CAA software. It is generally suitable for NEAs with at least a few days of observations if the least squares problem is convergent. If there are distant alternate solutions, as often happens for short-arc objects discovered near-Sun, this method is _unlikely_ to reveal those alternate solutions.

4) Statistical Ranging. This is really the only reliable means of computing orbits with very short arcs, ranging from a few minutes to a few days. This approach is also Monte Carlo in style, but it randomly samples two observations from the available set and selects two random topocentric distances at the observation times. From two obs and two distances you get an orbit, and that's your Virtual Asteroid. There are a host of variations on this method: you can also add noise to your sampled observations if you like. Dave Tholen and Rob Whiteley, working independently from Virtanen et al., have implemented a method that fits an orbit to all the available observations with the topocentric distance constraints applied. Or something like that. Statistical ranging _will_ reveal alternate solutions and will give robust uncertainty regions, which in some cases can be really wild looking.

5) Multiple Solutions. This is the method popularized by, if not invented by, Andrea Milani. It maps the spine of the elongated uncertainty region at epoch, and so it is a one-dimensional sampling, which substantially cuts the CPU requirements. But it is perhaps the most complicated of the methods, and I'm starting to realize this message is going far too long, and so I'll only say that this method is at the core of both of the automatic impact monitoring systems currently in operation (Sentry & NEODyS). Objectively, this approach has a pretty limited utility due to its complexity.

Each of the above methods has a fairly specific region where it is the most appropriate, but there is still a good amount of overlap. Methods 1-4 can be viewed as providing increasing power at the expense of simplicity and speed. Using statistical ranging to compute uncertainties on multi-opposition orbits would be crazy, not unlike using a sledgehammer to drive a brad. Elegance requires that you use the most simple method that is suitable, but, frankly, method 3) will work reasonably well for virtually all cases. And, yes, Bill, it's really that simple!

6) Did I say five? Well, there is also the semilinear method. But you definitely don't want to go there. It's more complicated than multiple solutions! I'm just adding this to keep out of trouble with Andrea Milani. ;)

Steve Chesley
Navigation & Mission Design Section, MS 301-150
Jet Propulsion Laboratory
Pasadena, California 91109
(818) 354-9615, Fax: 393-6388
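The element-space sampling in method 2 can be sketched in a few lines. A toy illustration under stated assumptions, not anyone's actual code: a hypothetical two-element state (say, semimajor axis and eccentricity) with an assumed covariance matrix, with covariance-consistent noise generated via a hand-rolled Cholesky factor:

```python
import math
import random

def cholesky_2x2(cov):
    # Lower-triangular L with L * L^T == cov, written out for the 2x2 case.
    l11 = math.sqrt(cov[0][0])
    l21 = cov[1][0] / l11
    l22 = math.sqrt(cov[1][1] - l21 * l21)
    return [[l11, 0.0], [l21, l22]]

def sample_virtual_asteroids(nominal, cov, n, seed=7):
    """Draw element sets ("Virtual Asteroids") consistent with cov."""
    L = cholesky_2x2(cov)
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        z0, z1 = rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)
        samples.append((nominal[0] + L[0][0] * z0,
                        nominal[1] + L[1][0] * z0 + L[1][1] * z1))
    return samples

nominal = (2.5, 0.10)                 # toy (a, e)
cov = [[1.0e-6, 2.0e-7],
       [2.0e-7, 1.0e-7]]              # assumed element covariance
vas = sample_virtual_asteroids(nominal, cov, 5000)

# The sample statistics should recover the input covariance.
mean_a = sum(s[0] for s in vas) / len(vas)
var_a = sum((s[0] - mean_a) ** 2 for s in vas) / len(vas)
```

In real use each sampled element set would then be propagated with a full N-body integrator to the time of interest, producing the banana- or corkscrew-shaped cloud described above.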


David Tholen
4) Statistical Ranging. This is really the only reliable means of computing orbits with very short arcs... <<<< A good question is whether our technique can be accurately described as being Monte Carlo in style. I've always associated randomness with the term. When I add noise to a set of observations, I generate random numbers, so that approach would certainly qualify for the term Monte Carlo. However, KNOBS does not use any random numbers at all. It's a methodical grid search, originally constrained by the hyperbolic limit (and one can apply a prograde constraint as well). As more observations are acquired, the constraint comes from statistics, but the topocentric distance and radial velocity are still taken from a grid of values rather than from generating random numbers over a specified range. What do you think, does it still qualify as being Monte Carlo in style?
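A methodical raster of the kind David describes needs no random numbers at all. Below is a deliberately simplified sketch: it grids heliocentric distance and speed and applies a two-body energy test as the "hyperbolic limit," a stand-in for the real KNOBS constraint, which works with topocentric distance and radial velocity plus the full observation geometry:

```python
import math

GM_SUN = 2.9591220828559e-4   # heliocentric GM, AU^3/day^2

def raster_search(r_min=0.5, r_max=5.0, v_min=-0.02, v_max=0.02,
                  n_r=40, n_v=40):
    """Step methodically over a (distance, speed) grid and keep
    candidates that pass the bound-orbit (non-hyperbolic) test."""
    accepted = []
    for i in range(n_r):
        r = r_min + (r_max - r_min) * i / (n_r - 1)       # AU
        for j in range(n_v):
            v = v_min + (v_max - v_min) * j / (n_v - 1)   # AU/day
            energy = 0.5 * v * v - GM_SUN / r             # two-body energy
            if energy < 0.0:                              # bound orbits only
                accepted.append((r, v))
    return accepted

grid = raster_search()   # surviving (distance, speed) candidates
```

Every grid cell is visited exactly once, so repeated runs give identical results, which is the deterministic behavior David contrasts with Monte Carlo sampling.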


Steve Chesley
David Tholen wrote:
Of course, raster sampling isn't random, and so it is not actually Monte Carlo. If memory serves, Virtanen et al. sample the topocentric range and range rate with uniformly distributed samples across some rectangular region, and so this would fall under the MC umbrella. The selection of the sampled region is iteratively refined based on the location of previous "acceptable" samples, much in the same way as you refine your raster search based on the results of the previous raster search. Thus the differences between the two methods are not really substantial. I should point out that Rob McNaught also independently developed a method along the same lines, at around the same time as the others.

Steve
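The sample-then-refine loop Steve describes might look like the toy sketch below. The `acceptable` function is a hypothetical stand-in for the real criterion (fit an orbit from the sampled range and range rate, then check the residuals against a chi-squared bound); here it just accepts an assumed elliptical region:

```python
import random

def acceptable(rho, rho_dot):
    # Hypothetical stand-in for the real acceptance test: accept a toy
    # elliptical region in (rho, rho_dot) space.
    return ((rho - 1.2) / 0.3) ** 2 + ((rho_dot - 0.005) / 0.004) ** 2 < 1.0

def statistical_ranging(n_samples=2000, n_iter=3, seed=11):
    """Uniformly sample a (range, range-rate) rectangle, then shrink
    the rectangle around the accepted samples and repeat."""
    rng = random.Random(seed)
    lo = [0.1, -0.02]          # initial rho (AU), rho_dot (AU/day) bounds
    hi = [5.0, 0.02]
    accepted = []
    for _ in range(n_iter):
        accepted = []
        for _ in range(n_samples):
            rho = rng.uniform(lo[0], hi[0])
            rho_dot = rng.uniform(lo[1], hi[1])
            if acceptable(rho, rho_dot):
                accepted.append((rho, rho_dot))
        if accepted:           # refine the rectangle around the cloud
            lo = [min(s[0] for s in accepted), min(s[1] for s in accepted)]
            hi = [max(s[0] for s in accepted), max(s[1] for s in accepted)]
    return accepted, (lo, hi)

samples, (final_lo, final_hi) = statistical_ranging()
```

Each iteration tightens the sampled rectangle around the surviving candidates, so the acceptance rate climbs and the final cloud of samples maps out the uncertainty region.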


Robert McNaught <rmn@...>
Thus the differences between the two methods are not really substantial. <<<< Just for the record, I wrote PANGLOSS in around 1986 on a Commodore VIC-20! Due to the extremely slow speed of the Commodore, I did little with it until it was rewritten in FORTRAN in the early 1990s and used at the UK Schmidt during Duncan Steel's AANEAS program. Cheers, Rob

