Rectangular field of view

I have attempted the suggested approach (stick to the FOV attached to your satellite and use a FieldOfViewDetector (see documentation) combined with an ElevationDetector (see documentation)). I tried combining the ElevationDetector and the FieldOfViewDetector with BooleanDetector.andCombine(java.util.Collection), but I encountered some issues: I don't know how to call the relevant methods at all. Can you provide specific code? Can you continue to modify based on my code?
Thank you, best wishes!
Below is my source code:
// Initial state definition : date, orbit
final AbsoluteDate initialDate = new AbsoluteDate(2023, 9, 15, 4, 0, 00.000, TimeScalesFactory.getUTC());
final double mu = 3.986004415e+14; // gravitation coefficient
final Frame inertialFrame = FramesFactory.getEME2000(); // inertial frame for orbit definition
final Vector3D position = new Vector3D(2797914.567, -2288195.171, 6012468.374);
final Vector3D velocity = new Vector3D(-6089.132, 2403.774, 3732.121);
final PVCoordinates pvCoordinates = new PVCoordinates(position, velocity);
final Orbit initialOrbit = new KeplerianOrbit(pvCoordinates, inertialFrame, initialDate, mu);

// Earth and frame
final Frame earthFrame = FramesFactory.getITRF(IERSConventions.IERS_2010, true);
final BodyShape earth = new OneAxisEllipsoid(Constants.WGS84_EARTH_EQUATORIAL_RADIUS,
		Constants.WGS84_EARTH_FLATTENING, earthFrame);

// Station
final double longitude = FastMath.toRadians(75.9797);
final double latitude = FastMath.toRadians(39.4547);
final double altitude = 0.;
final GeodeticPoint station1 = new GeodeticPoint(latitude, longitude, altitude);
final TopocentricFrame sta1Frame = new TopocentricFrame(earth, station1, "Kashgar");

// Defining rectangular field of view
double halfApertureAlongTrack = FastMath.toRadians(50);
double halfApertureAcrossTrack = FastMath.toRadians(50);

FieldOfView fov = new DoubleDihedraFieldOfView(Vector3D.MINUS_I, // From satellite to body center
		Vector3D.PLUS_K, halfApertureAcrossTrack, // Across track direction
		Vector3D.PLUS_J, halfApertureAlongTrack, // Along track direction
		0); // Angular margin

// Defining attitude provider
AttitudeProvider attitudeProvider = new LofOffset(inertialFrame, LOFType.EQW);

// Defining your propagator and setting up the attitude provider
Propagator propagator = new KeplerianPropagator(initialOrbit);
propagator.setAttitudeProvider(attitudeProvider);

// Event definition
final double maxcheck = 60.0;
final double threshold = 0.001;
final double elevation = FastMath.toRadians(5.0);
final EventDetector sta1Visi = new ElevationDetector(maxcheck, threshold, sta1Frame)
		.withConstantElevation(elevation).withHandler((s, detector, increasing) -> {
			if (increasing) {
				System.out.println("Visibility begins on " + detector.getTopocentricFrame().getName() + " at "
						+ s.getDate());
			} else {
				System.out.println("Visibility ends on " + detector.getTopocentricFrame().getName() + " at "
						+ s.getDate());
			}
			return Action.CONTINUE; // Continue processing the event
		});

// Add event to be detected
propagator.addEventDetector(sta1Visi);

// Propagate from the initial date to the first raising or for the fixed
// duration
final SpacecraftState finalState = propagator.propagate(initialDate.shiftedBy(14400.));

System.out.println(" Final state : " + finalState.getDate().durationFrom(initialDate));

This LOF type seems strange to me for an observation satellite, as it rotates about the orbital momentum. If you want the Y axis to point roughly downwards, I would suggest using for example LOFType.LVLH with a π/2 rotation around the third axis (Z) (or maybe -π/2, I let you check).
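In code, that suggested provider could look like the sketch below (the RotationOrder and the sign of the angle are assumptions you should verify against your sensor mounting and field-of-view definition):

```java
// LVLH local orbital frame with an extra rotation around the third axis;
// the π/2 angle re-orients the in-plane axes as suggested above
// (check whether +π/2 or -π/2 matches your field-of-view axes)
AttitudeProvider attitudeProvider =
        new LofOffset(inertialFrame, LOFType.LVLH,
                      RotationOrder.ZYX, FastMath.PI / 2, 0.0, 0.0);
```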

In this example, you don’t define the FieldOfViewDetector at all. Perhaps you should add this:

// here we use the trick that TopocentricFrame implements PVCoordinatesProvider
// so we will track the station within the fov (hoping it is what you want)
FieldOfViewDetector fovDetector = new FieldOfViewDetector(sta1Frame, fov);

// here, we must revert the sign of the fov detector, so positive means visible
// (for the raw detector, negative means visible)
BooleanDetector bd = BooleanDetector.andCombine(sta1Visi, new NegateDetector(fov));

// now we can add the boolean detector to the propagator (NOT the raw detectors)
propagator.addEventDetector(bd);



Something went wrong. How can I fix it?

Sorry, I should have written new NegateDetector(fovDetector).
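For completeness, here is the corrected block (a sketch continuing the code above, with the fovDetector variable actually used in the combination):

```java
// TopocentricFrame implements PVCoordinatesProvider, so the station frame
// can serve directly as the target tracked inside the field of view
final FieldOfViewDetector fovDetector = new FieldOfViewDetector(sta1Frame, fov);

// the raw FieldOfViewDetector g function is negative inside the field of
// view, so negate it: in the boolean combination, positive must mean "true"
final BooleanDetector bd =
        BooleanDetector.andCombine(sta1Visi, new NegateDetector(fovDetector));

// register only the combined detector, not the raw ones
propagator.addEventDetector(bd);
```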

Could you please explain the specific meaning of this code?

Yes.
Event detectors in Orekit are a way to detect discrete events occurring during the continuous process of propagation. The precise time of an event may be unknown when the propagator is configured to look for it; the propagator discovers the occurrence time on the fly. In order to do that, the event is not defined by this time directly but by a so-called g function that depends on the spacecraft state, i.e. time, orbit, attitude, mass, and any additional state users may have embedded into the spacecraft state. When this g function changes sign, either going from positive to negative or from negative to positive, we consider the event occurs. If the function goes from positive to negative, then back to positive, then back to negative, the event will be triggered several times, once at each zero crossing. A simple example would be a g function equal to the Z coordinate of the spacecraft: this event would be triggered at each equator crossing (i.e. when Z changes sign). This is exactly how NodeDetector works.
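The zero-crossing mechanics can be pictured with a tiny stand-alone sketch (plain Java, not Orekit code; the 6000 s period, the sampling step, and the bisection refinement are made-up illustrations): the g function is sampled at a coarse step, a sign change between two samples brackets an event, and root finding then refines the event time, much like the maxcheck/threshold pair in your ElevationDetector.

```java
public class GFunctionDemo {
    // Toy g function (not the real NodeDetector): the Z coordinate of a
    // spacecraft on a made-up 6000 s circular orbit; a sign change of g
    // brackets an equator crossing
    static double g(double t) {
        return Math.sin(2.0 * Math.PI * t / 6000.0);
    }

    public static void main(String[] args) {
        final double step = 60.0;      // coarse sampling, like maxcheck
        double prev = g(0.0);
        for (double t = step; t <= 6000.0; t += step) {
            final double curr = g(t);
            if (prev * curr < 0.0) {   // sign change: event lies in (t-step, t]
                // bisection refines the crossing time down to a threshold,
                // mimicking the propagator's root-finding step
                double lo = t - step;
                double hi = t;
                while (hi - lo > 1.0e-3) {
                    final double mid = 0.5 * (lo + hi);
                    if (g(lo) * g(mid) <= 0.0) {
                        hi = mid;
                    } else {
                        lo = mid;
                    }
                }
                System.out.printf(java.util.Locale.US,
                        "event at t = %.3f s (%s node)%n",
                        0.5 * (lo + hi), curr > 0.0 ? "ascending" : "descending");
            }
            prev = curr;
        }
    }
}
```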

Here we want to combine two conditions: a ground station must see a satellite taking its orbit into account and at the same time this station must be in the field of view of the satellite taking into account both its orbit and its attitude. The g function will therefore be somewhat complicated. We build it in several steps by combining two underlying functions.

As we want the logical and of the two conditions (visibility of the satellite from the ground, and visibility of the station from the satellite), we use BooleanDetector.andCombine(d1, d2), which combines the g function of detector d1 with the g function of detector d2, assuming both g functions are positive when the corresponding condition is true and negative when it is false. This boolean combination therefore implies a more stringent convention on g functions than traditional detectors: not only does it consider that the event occurs when the sign changes, it also assigns a meaning to the sign when it is not zero.

So we want detector d1 to represent the condition "satellite is visible from the ground" and detector d2 to represent the condition "station is visible in the satellite field of view".

We therefore use ElevationDetector as detector d1, because looking at its g function we see it is positive when the satellite is above the specified triggering elevation (5.0° in your case), so it already has the meaning we want. In fact, the g function here is just the satellite elevation minus the specified triggering elevation.

For detector d2, we cannot use FieldOfViewDetector directly because, unfortunately, the meaning of its g function is not what we want: the function is negative when the target point (here the ground station) is within the field of view and positive when it is outside the field of view. It has exactly the reverse sign of what we want! So we use new NegateDetector(fovDetector) to reverse the sign before performing the and combination.

With this combination, the start and end of visibility events will be triggered only when the two conditions are fulfilled, i.e. when the satellite is above specified elevation and station is within field of view as seen from satellite.
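To make the sign conventions concrete, here is a stand-alone sketch (plain Java, not the actual Orekit implementation; the two g functions are invented toy ones, here combined as a minimum): the and-combination is positive only when both underlying g functions are positive, which is why the raw field-of-view g had to be negated first.

```java
import java.util.function.DoubleUnaryOperator;

public class AndCombineSketch {
    public static void main(String[] args) {
        // made-up g functions of time t (seconds):
        // g1: elevation condition, positive once the satellite rises
        // above the trigger elevation (here, after t = 100 s)
        DoubleUnaryOperator g1 = t -> t - 100.0;

        // g2raw: raw FieldOfViewDetector convention, NEGATIVE inside the
        // field of view (here, after t = 250 s) ...
        DoubleUnaryOperator g2raw = t -> 250.0 - t;
        // ... so we negate it before combining, like NegateDetector does
        DoubleUnaryOperator g2 = t -> -g2raw.applyAsDouble(t);

        // and-combination: positive only when BOTH conditions are positive
        DoubleUnaryOperator gAnd =
                t -> Math.min(g1.applyAsDouble(t), g2.applyAsDouble(t));

        for (double t : new double[] { 50.0, 150.0, 300.0 }) {
            System.out.printf(java.util.Locale.US, "t=%4.0f s  visible=%b%n",
                    t, gAnd.applyAsDouble(t) > 0.0);
        }
    }
}
```

At t = 150 s the satellite is above the trigger elevation but the station is not yet in the field of view, so the combined condition is still false; only after both g functions turn positive does visibility begin.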


OK, thank you. I will give it a try.