I'm trying to model a satellite sensor FOV with the
DihedralFieldOfViewDetector. I'm assuming this would be the right
approach. I've attached the complete, compilable source to this
email in case you want to take a look or run it.
I started out with the test code for DihedralFieldOfViewDetector:
https://www.orekit.org/forge/projects/orekit/repository/revisions/master/entry/src/test/java/org/orekit/propagation/events/DihedralFieldOfViewDetectorTest.java
And started modifying it to fit my needs. I changed the propagator to
use TLEPropagator with the appropriate TLEs for the day in
question. (I know that part works because I'm using the same code
to calculate satellite subtracks and line-of-sight visibility
schedules, and it's working perfectly.)
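For context, the propagator setup looks roughly like this. The TLE below is the well-known ISS sample element set standing in for the actual elements for the day in question, and I'm assuming the Orekit data configuration is done elsewhere in the program:

```java
import org.orekit.propagation.analytical.tle.TLE;
import org.orekit.propagation.analytical.tle.TLEPropagator;

public class TlePropagatorSketch {
    public static void main(String[] args) {
        // Example TLE (the well-known ISS sample set); the real run uses
        // the element set for the day in question.
        String line1 = "1 25544U 98067A   08264.51782528 -.00002182  00000-0 -11606-4 0  2927";
        String line2 = "2 25544  51.6416 247.4627 0006703 130.5360 325.0288 15.72125391563537";

        TLE tle = new TLE(line1, line2);
        // selectExtrapolator picks SGP4 or SDP4 automatically from the
        // orbital period encoded in the TLE
        TLEPropagator propagator = TLEPropagator.selectExtrapolator(tle);
    }
}
```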
Now I get stuck on the parameters to feed the DihedralFieldOfViewDetector.
I changed:
final PVCoordinatesProvider sunPV = CelestialBodyFactory.getSun();
to
GeodeticPoint point = new GeodeticPoint(FastMath.toRadians(25.61379),
                                        FastMath.toRadians(-80.38402),
                                        0.0);
TopocentricFrame stationFrame =
    new TopocentricFrame(earth, point, "CSTARS Ground Station");
final PVCoordinatesProvider aoiTarget = stationFrame;
So that it detects when the FOV encounters a target on Earth rather than the Sun.
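For completeness, the earth variable referenced above is built like this. Treat it as a sketch: the ITRF factory method name varies between Orekit releases, and getITRF2005() is just what my version provides:

```java
import org.orekit.bodies.OneAxisEllipsoid;
import org.orekit.errors.OrekitException;
import org.orekit.frames.Frame;
import org.orekit.frames.FramesFactory;
import org.orekit.utils.Constants;

public class EarthModelSketch {
    public static OneAxisEllipsoid buildEarth() throws OrekitException {
        // Earth-fixed frame; newer Orekit releases spell this method
        // differently (e.g. via IERS conventions), so adjust as needed
        Frame itrf = FramesFactory.getITRF2005();
        // WGS84 ellipsoid tied to the Earth-fixed frame
        return new OneAxisEllipsoid(Constants.WGS84_EARTH_EQUATORIAL_RADIUS,
                                    Constants.WGS84_EARTH_FLATTENING,
                                    itrf);
    }
}
```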
I left the Vector3D parameters the same because I don't know enough
to know what to change.
And I changed the aperture1 & aperture2 variables like so:
final double aperture1 = FastMath.toRadians(20);
final double aperture2 = FastMath.toRadians(45);
Because the sensor I'm trying to model covers incidence angles from
20 to 45 degrees.
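Putting it all together, the detector construction currently looks like this. The center and dihedral-axis vectors are copied verbatim from the test, which is exactly the part I suspect is wrong for my sensor geometry (package names for Vector3D/FastMath may differ depending on the commons-math version your Orekit build uses):

```java
import org.apache.commons.math3.geometry.euclidean.threed.Vector3D;
import org.apache.commons.math3.util.FastMath;
import org.orekit.propagation.events.DihedralFieldOfViewDetector;
import org.orekit.utils.PVCoordinatesProvider;

public class FovDetectorSketch {
    public static DihedralFieldOfViewDetector buildDetector(PVCoordinatesProvider aoiTarget) {
        // maxCheck: maximum interval between event checks, in seconds
        final double maxCheck = 1.0;

        // FOV geometry copied from DihedralFieldOfViewDetectorTest:
        // boresight along +X of the spacecraft frame, dihedral axes
        // along +Z and +Y. Whether this matches my sensor's actual
        // mounting is the part I don't know.
        final Vector3D center = Vector3D.PLUS_I;
        final Vector3D axis1  = Vector3D.PLUS_K;
        final Vector3D axis2  = Vector3D.PLUS_J;

        final double aperture1 = FastMath.toRadians(20);
        final double aperture2 = FastMath.toRadians(45);

        return new DihedralFieldOfViewDetector(maxCheck, aoiTarget,
                                               center,
                                               axis1, aperture1,
                                               axis2, aperture2);
    }
}
```

The detector would then be registered on the propagator with propagator.addEventDetector(...) before propagating over the window of interest.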
So I'm pretty sure I have multiple things wrong here. I'd
appreciate it if anyone could tell me what they are.