Re: [Orekit Users] FieldOfView usage/help
aaron.rich@gmail.com wrote:
Hi,
Hi Aaron,
I'm new to using Orekit and am hoping someone can help me understand how to use/define fields of view.
I'm trying to calculate when a satellite will be within a ground site's field of view. The GroundFieldOfViewDetector seems to be perfect for this, but I can't figure out how to define the FieldOfView for the site. I tried looking at the test case but wasn't able to fully understand it.
If the field of view is supposed to be centered at an azimuth of 45 degrees (0 = East) with a horizontal half angle of 60 degrees (so 120 in total) and an elevation centered at 60 degrees above the horizon with a half angle of 30 degrees, how would I build that FieldOfView to use with the GroundFieldOfViewDetector?
Here are two different suggestions, depending on the real shape you want to use for your field of view. Here, all vectors are defined in the topocentric frame (X towards East, Y towards North, Z towards Zenith). I assumed in this code that since you start counting from East, you also count positively towards North, which is *not* the classical azimuth definition (which considers that 0° is North and 90° is East). If you can, I strongly suggest you adopt the conventional azimuth definition instead of the one you presented, as it will create problems if you need to exchange data with more conventional systems.
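For reference, converting between the two conventions is a one-liner; here is a minimal standalone sketch (class and method names are mine, not Orekit API):

```java
public class AzimuthConvention {

    /** Convert an angle counted from East towards North (degrees)
     *  into a conventional azimuth counted from North towards East,
     *  wrapped into [0, 360). The relation is az = 90° - angleFromEast. */
    static double eastToConventional(double angleFromEastDeg) {
        double az = 90.0 - angleFromEastDeg;
        return ((az % 360.0) + 360.0) % 360.0; // wrap into [0, 360)
    }

    public static void main(String[] args) {
        System.out.println(eastToConventional(45.0));  // 45.0 (North-East in both conventions)
        System.out.println(eastToConventional(0.0));   // 90.0 (due East)
        System.out.println(eastToConventional(90.0));  // 0.0 (due North)
    }
}
```

Note that 45° from East happens to map to 45° conventional azimuth, which is why the example in your question looks the same in both conventions; any other direction differs.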
First suggestion, assuming that your field of view really always includes the zenith as its top point, and is basically a triangular shape extending from zenith down to 30° elevation (i.e. 60° - 30°):
  // canonical parameters of the field of view
  final double centerAzimuth   = FastMath.toRadians(45.0);
  final double azimuthHalfSpan = FastMath.toRadians(60.0);
  final double minElevation    = FastMath.toRadians(30.0);

  // convert the canonical parameters to construction parameters
  final double cMinus = FastMath.cos(centerAzimuth - azimuthHalfSpan);
  final double sMinus = FastMath.sin(centerAzimuth - azimuthHalfSpan);
  final double cPlus  = FastMath.cos(centerAzimuth + azimuthHalfSpan);
  final double sPlus  = FastMath.sin(centerAzimuth + azimuthHalfSpan);
  final double cosEl  = FastMath.cos(minElevation);
  final double sinEl  = FastMath.sin(minElevation);
  final double hyperplaneThickness = 1.0e-12;
  final SphericalPolygonsSet sps =
      new SphericalPolygonsSet(hyperplaneThickness,
                               S2Point.PLUS_K,
                               new S2Point(new Vector3D(cMinus * cosEl, sMinus * cosEl, sinEl)),
                               new S2Point(new Vector3D(cPlus * cosEl, sPlus * cosEl, sinEl)));
  final FieldOfView fov = new FieldOfView(sps, 0.0);
The points here are the vertices of the fov, which is a triangle in this case. Beware that here I assumed maxElevation was exactly 90°, hence the single point S2Point.PLUS_K. If you need maxElevation < 90°, you should set up a four-vertex trapezoidal shape. However, do not try to set up a trapezoidal shape in all cases and later set maxElevation to 90°, otherwise the topmost edge would degenerate to a zero-size edge and this will not work.
Beware that the order of the three points is important. If you reverse the ordering, you get the complement of the region! Both the ordering of the points and the coordinates computation have been made compliant with an azimuth counted from East to North.
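If you do need the four-vertex trapezoid for maxElevation < 90°, the vertex coordinates can be sketched like this with plain java.lang.Math so the geometry can be checked standalone (names are mine; the resulting vectors would be wrapped in S2Point instances and passed to the SphericalPolygonsSet constructor, keeping the same ordering convention as the triangle above, i.e. traversing the bottom edge from az- to az+):

```java
public class TrapezoidFov {

    /** Unit vector in the topocentric frame (X = East, Y = North, Z = Zenith)
     *  for an angle counted from East towards North and an elevation. */
    static double[] direction(double azFromEast, double elevation) {
        return new double[] {
            Math.cos(azFromEast) * Math.cos(elevation),
            Math.sin(azFromEast) * Math.cos(elevation),
            Math.sin(elevation)
        };
    }

    public static void main(String[] args) {
        final double centerAzimuth   = Math.toRadians(45.0);
        final double azimuthHalfSpan = Math.toRadians(60.0);
        final double minElevation    = Math.toRadians(30.0);
        final double maxElevation    = Math.toRadians(80.0); // must stay strictly below 90°

        // bottom edge from az- to az+, then top edge back from az+ to az-
        double[][] vertices = {
            direction(centerAzimuth - azimuthHalfSpan, minElevation),
            direction(centerAzimuth + azimuthHalfSpan, minElevation),
            direction(centerAzimuth + azimuthHalfSpan, maxElevation),
            direction(centerAzimuth - azimuthHalfSpan, maxElevation)
        };
        for (double[] v : vertices) {
            System.out.printf("%.6f %.6f %.6f%n", v[0], v[1], v[2]);
        }
    }
}
```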
Second suggestion, assuming that your field of view really is a rectangular shape (more accurately a double dihedra) and that the half angles are really computed from the center (i.e. at 60° elevation) and not from the horizon:
  // canonical parameters of the field of view
  final double centerAzimuth     = FastMath.toRadians(45.0);
  final double azimuthHalfSpan   = FastMath.toRadians(60.0);
  final double centerElevation   = FastMath.toRadians(60.0);
  final double elevationHalfSpan = FastMath.toRadians(30.0);

  // convert the canonical parameters to construction parameters
  // dihedra 1 is considered to be the azimuth span
  // dihedra 2 is considered to be the elevation span
  final double cosAz = FastMath.cos(centerAzimuth);
  final double sinAz = FastMath.sin(centerAzimuth);
  final double cosEl = FastMath.cos(centerElevation);
  final double sinEl = FastMath.sin(centerElevation);
  final FieldOfView fov =
      new FieldOfView(new Vector3D( cosAz * cosEl,  sinAz * cosEl, sinEl),
                      new Vector3D(-cosAz * sinEl, -sinAz * sinEl, cosEl), azimuthHalfSpan,
                      new Vector3D(-sinAz, cosAz, 0.0), elevationHalfSpan,
                      0.0);
The first vector is the center of the field of view, i.e. it points towards North-East and at 60° elevation. The second vector is the axis of the azimuth span, i.e. it is a vector that would be vertical if elevation were 0°, but that "leans backward" to look up at 60°. The third vector is the axis of the elevation span. Beware that with this constructor, the 60° "azimuth" span is not really in azimuth since the fov looks largely above the horizon.
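As a sanity check on this geometry (plain java.lang.Math, names are mine), the center and the two dihedra axes can be verified to be mutually orthogonal, which is what the constructor expects:

```java
public class DihedraAxesCheck {

    /** Plain dot product of two 3-component vectors. */
    static double dot(double[] a, double[] b) {
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
    }

    public static void main(String[] args) {
        double cosAz = Math.cos(Math.toRadians(45.0));
        double sinAz = Math.sin(Math.toRadians(45.0));
        double cosEl = Math.cos(Math.toRadians(60.0));
        double sinEl = Math.sin(Math.toRadians(60.0));

        double[] center = {  cosAz * cosEl,  sinAz * cosEl, sinEl }; // fov center
        double[] axis1  = { -cosAz * sinEl, -sinAz * sinEl, cosEl }; // azimuth span axis
        double[] axis2  = { -sinAz,          cosAz,         0.0   }; // elevation span axis

        System.out.println(dot(center, axis1)); // ~0: axis1 orthogonal to center
        System.out.println(dot(center, axis2)); // ~0: axis2 orthogonal to center
        System.out.println(dot(axis1,  axis2)); // ~0: the two axes are orthogonal
    }
}
```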
How would I also put a min and max range limit on the Field of View?
You should first create a RangeIntervalDetector, centered on the same topocentric frame (we don't have this detector, but it would be a nice addition to Orekit; we may add it if you want). Then you should combine the GroundFieldOfViewDetector and the RangeIntervalDetector using BooleanDetector.andCombine(groundFovDetector, rangeIntervalDetector) and use the combined detector in your propagator. This would ensure that events are triggered when the propagated object enters or leaves the curved slab defined by these two detectors. The object may for example enter from the side (really triggered by the GroundFieldOfViewDetector) and exit from the rear (really triggered by the RangeIntervalDetector).
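Since the RangeIntervalDetector does not exist yet, here is a standalone sketch of the switching function it could use (class and method names are mine): Orekit event detectors signal events through a g function that changes sign at the event, so a natural choice is a function that is positive inside [minRange, maxRange], negative outside, and zero at both boundaries.

```java
public class RangeIntervalG {

    /** Sketch of a switching function for a range interval:
     *  positive when minRange <= range <= maxRange, negative outside,
     *  zero exactly at the two boundaries where events would trigger. */
    static double g(double range, double minRange, double maxRange) {
        return Math.min(range - minRange, maxRange - range);
    }

    public static void main(String[] args) {
        System.out.println(g(1.0e6, 5.0e5, 2.0e6) > 0); // true: inside the interval
        System.out.println(g(3.0e6, 5.0e5, 2.0e6) < 0); // true: beyond maxRange
        System.out.println(g(5.0e5, 5.0e5, 2.0e6));     // 0.0: at the near boundary
    }
}
```

A detector built on such a g function would then be combined with the ground fov detector via BooleanDetector.andCombine, which triggers only while both g functions are positive.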
I previously had tried using a FixedStepHandler before finding the GroundFieldOfViewDetector. The handler code was this:
  public void handleStep(SpacecraftState spacecraftState, boolean b) throws OrekitException {
      double elevation = Math.toDegrees(frame.getElevation(spacecraftState.getPVCoordinates().getPosition(),
                                                           spacecraftState.getFrame(),
                                                           spacecraftState.getDate()));
      double azimuth   = Math.toDegrees(frame.getAzimuth(spacecraftState.getPVCoordinates().getPosition(),
                                                         spacecraftState.getFrame(),
                                                         spacecraftState.getDate()));
      double range     = frame.getRange(spacecraftState.getPVCoordinates().getPosition(),
                                        spacecraftState.getFrame(),
                                        spacecraftState.getDate());

      // if none of the checks are configured to happen, default to saving the point;
      // else, check bounds for each parameter to determine if the sat is within limits
      boolean save_state = true;
      if (check_azimuth && (azimuth < min_az || azimuth > max_az)) {
          save_state = false;
      }
      if (check_elevation && (elevation < min_elevation || elevation > max_elevation)) {
          save_state = false;
      }
      if (check_range && (range < min_range || range > max_range)) {
          save_state = false;
      }

      // if it passed all checks, save
      if (save_state) {
          ...
frame is a TopocentricFrame.
Beware that using this code, you use the conventional definition of azimuth, which is 0° at North, 90° at East, 180° at South and 270° at West. The fov shape corresponding to this code is the first suggestion above.
hope this helps,
Luc
Thanks for any help or insight getting pointed in the right direction.