/* Copyright 2002-2021 CS GROUP
 * Licensed to CS GROUP (CS) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * CS licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *   http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package org.orekit.estimation.leastsquares;

import org.hipparchus.linear.MatrixDecomposer;
import org.hipparchus.linear.QRDecomposer;
import org.hipparchus.optim.nonlinear.vector.leastsquares.LeastSquaresProblem.Evaluation;
import org.hipparchus.optim.nonlinear.vector.leastsquares.SequentialGaussNewtonOptimizer;
import org.orekit.propagation.conversion.OrbitDeterminationPropagatorBuilder;
import org.orekit.propagation.conversion.PropagatorBuilder;
/**
 * Sequential least squares estimator for orbit determination.
 * <p>
 * When an orbit has already been estimated and new measurements are given, it is not efficient
 * to re-optimize the whole problem. Considering only the new measurements during the optimization
 * will not give good results either, since the old measurements would not be taken into account.
 * Thus, a sequential estimator is used: it estimates the orbit using both the results of the
 * previous estimation and the new measurements.
 * <p>
 * In order to perform a sequential optimization, the user must configure a
 * {@link org.hipparchus.optim.nonlinear.vector.leastsquares.SequentialGaussNewtonOptimizer SequentialGaussNewtonOptimizer}.
 * Depending on whether its input data are an empty {@link Evaluation}, a complete <code>Evaluation</code>,
 * or an a priori state and covariance, different configurations are possible.
 * <p>
 * <b>1. No input data from a previous estimation</b>
 * <p>
 * Then, the {@link SequentialBatchLSEstimator} can be used like a {@link BatchLSEstimator}
 * to perform the estimation. The user can initialize the <code>SequentialGaussNewtonOptimizer</code>
 * using the default constructor.
 * <p>
 * <code>final SequentialGaussNewtonOptimizer optimizer = new SequentialGaussNewtonOptimizer();</code>
 * <p>
 * By default, a {@link QRDecomposer} is used as the decomposition algorithm. In addition, normal
 * equations are not formed. These two default settings can be changed using the following methods
 * (a complete sketch is given after this list):
 * <ul>
 * <li>the {@link org.hipparchus.optim.nonlinear.vector.leastsquares.SequentialGaussNewtonOptimizer#withDecomposer(MatrixDecomposer) withDecomposer} method:
 * <code>optimizer.withDecomposer(newDecomposer);</code>
 * </li>
 * <li>the {@link org.hipparchus.optim.nonlinear.vector.leastsquares.SequentialGaussNewtonOptimizer#withFormNormalEquations(boolean) withFormNormalEquations} method:
 * <code>optimizer.withFormNormalEquations(newFormNormalEquations);</code>
 * </li>
 * </ul>
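 * <p>
 * As an illustration, a minimal sketch of this first case is given below; <code>propagatorBuilder</code>
 * is assumed to be already configured by the user, and the <code>1.0e-11</code> singularity threshold
 * is only an example value.
 * <pre>
 * // illustrative names and values; propagatorBuilder is a user-provided OrbitDeterminationPropagatorBuilder
 * final SequentialGaussNewtonOptimizer optimizer = new SequentialGaussNewtonOptimizer().
 *                                                  withDecomposer(new QRDecomposer(1.0e-11)).
 *                                                  withFormNormalEquations(false);
 * final SequentialBatchLSEstimator estimator = new SequentialBatchLSEstimator(optimizer, propagatorBuilder);
 * </pre>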
 * <p>
 * <b>2. Initialization using a previous <code>Evaluation</code></b>
 * <p>
 * In this situation, it is recommended to use the second constructor of the optimizer class.
 * <p>
 * <code>final SequentialGaussNewtonOptimizer optimizer = new SequentialGaussNewtonOptimizer(decomposer,
 *                                                                                           formNormalEquations,
 *                                                                                           evaluation);
 * </code>
 * <p>
 * Using this constructor, the user can directly configure the <code>MatrixDecomposer</code> and set the
 * flag for normal equations without calling the two previously presented methods.
 * <p>
 * <i>Note:</i> This constructor can also be used to perform the initialization of case <b>1.</b>
 * In that case, the <code>Evaluation evaluation</code> parameter is <code>null</code>.
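 * <p>
 * For example, a minimal sketch of this second case is given below; <code>previousEvaluation</code> is
 * assumed to hold the {@link Evaluation} resulting from an earlier estimation, and
 * <code>propagatorBuilder</code> to be already configured by the user.
 * <pre>
 * // previousEvaluation and propagatorBuilder are assumed to come from a previous run and from the user setup
 * final SequentialGaussNewtonOptimizer optimizer =
 *                 new SequentialGaussNewtonOptimizer(new QRDecomposer(1.0e-11), false, previousEvaluation);
 * final SequentialBatchLSEstimator estimator = new SequentialBatchLSEstimator(optimizer, propagatorBuilder);
 * </pre>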
 * <p>
 * <b>3. Initialization using an a priori estimated state and covariance</b>
 * <p>
 * This situation is a classical need in satellite operations. Indeed, a common practice is to use
 * the results of a previous orbit determination (estimated state and covariance), performed a day before,
 * to improve the initialization and the results of an orbit determination performed the current day.
 * In this situation, the user can initialize the <code>SequentialGaussNewtonOptimizer</code>
 * using the default constructor.
 * <p>
 * <code>final SequentialGaussNewtonOptimizer optimizer = new SequentialGaussNewtonOptimizer();</code>
 * <p>
 * The <code>MatrixDecomposer</code> and the flag about normal equations can again be updated using the
 * two previously presented methods. The a priori state and covariance matrix can be set using the
 * following method (see the sketch after this list):
 * <ul>
 * <li>the {@link org.hipparchus.optim.nonlinear.vector.leastsquares.SequentialGaussNewtonOptimizer#withAPrioriData(org.hipparchus.linear.RealVector, org.hipparchus.linear.RealMatrix) withAPrioriData} method:
 * <code>optimizer.withAPrioriData(aPrioriState, aPrioriCovariance);</code>
 * </li>
 * </ul>
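 * <p>
 * A minimal sketch of this third case is given below; <code>aPrioriState</code> (a
 * {@link org.hipparchus.linear.RealVector RealVector}), <code>aPrioriCovariance</code> (a
 * {@link org.hipparchus.linear.RealMatrix RealMatrix}) and <code>propagatorBuilder</code> are assumed
 * to come from the previous orbit determination and from the user setup.
 * <pre>
 * // aPrioriState, aPrioriCovariance and propagatorBuilder are user-provided inputs
 * final SequentialGaussNewtonOptimizer optimizer = new SequentialGaussNewtonOptimizer().
 *                                                  withAPrioriData(aPrioriState, aPrioriCovariance);
 * final SequentialBatchLSEstimator estimator = new SequentialBatchLSEstimator(optimizer, propagatorBuilder);
 * </pre>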
 * @author Julie Bayard
 * @since 11.0
 */
public class SequentialBatchLSEstimator extends BatchLSEstimator {

    /**
     * Simple constructor.
     * <p>
     * If multiple {@link PropagatorBuilder propagator builders} are set up, the
     * orbits of several spacecraft will be used simultaneously. This is useful
     * if the propagators share some model or measurement parameters (typically
     * pole motion, prime meridian correction or ground station positions).
     * </p>
     * <p>
     * Setting up multiple {@link PropagatorBuilder propagator builders} is also
     * useful when inter-satellite measurements are used, even if only one of
     * the orbits is estimated and the other ones are fixed. This is typically
     * used when very high accuracy GNSS measurements are needed and the
     * navigation bulletins are not considered accurate enough and the
     * navigation constellation must be propagated numerically.
     * </p>
     * <p>
     * The solver used for the sequential least squares problem is a
     * {@link org.hipparchus.optim.nonlinear.vector.leastsquares.SequentialGaussNewtonOptimizer
     * sequential Gauss-Newton optimizer}.
     * Details about how to initialize it are given in the class JavaDoc.
     * </p>
     *
     * @param sequentialOptimizer solver for the sequential least squares problem
     * @param propagatorBuilder builders to use for propagation
     */
    public SequentialBatchLSEstimator(final SequentialGaussNewtonOptimizer sequentialOptimizer,
                                      final OrbitDeterminationPropagatorBuilder... propagatorBuilder) {
        super(sequentialOptimizer, propagatorBuilder);
    }

}