Simulations of observations

Hi guys!

I need to construct and carry out some simulations. Let’s say I have a wide-field camera and want to carry out survey-style observations. I know the geo-location of my camera, the FOV of the camera (width and height in degrees), and the direction in which the camera is pointing in ALT and AZ. With such a configuration I want to take pictures, let’s say, every 30 s for a week or two and note the NORAD ID, UTC time, ALT and AZ of every caught satellite. Can Orekit support such simulations in a convenient manner, given TLEs of the satellites I’m interested in?

Hi @jlipinski,

Yes, it can!
I’ll give you pointers on how to do it. Don’t hesitate to come back to us if you need more details.

First, an example: the tutorial MeasurementGenerator by Bryan Cazabonne.

Then a small roadmap; it follows the logic of the run method in Bryan’s tutorial.

  • Build a TopocentricFrame from your camera position in ITRF frame;
  • Also create a GroundStation object from the TopocentricFrame
  • Build a measurement Generator: final Generator generator = new Generator();
  • Get the list of your TLEs; try to have them around the same date (it’s better for the performance of the MultiSatellitePropagator)
  • For each TLE:
    • Set up a TLEPropagator from it → this is equivalent to the buildPropagator method in Bryan’s tutorial;
    • Add it to the Generator and store the ObservableSatellite in a variable;
    • Create an AngularAzElBuilder in the same fashion as the getBuilder function in the tutorial;
    • Create an EventBasedScheduler and add it to the Generator (see l. 208 of the tutorial) with inputs:
      • MeasurementBuilder: the AngularAzElBuilder
      • DatesSelector: a FixedStepSelector with 30s step
      • ObservableSatellite: the one created earlier
      • EventDetector: a GroundFieldOfViewDetector with the TopocentricFrame and the DoubleDihedralFieldOfView of your camera. Use withHandler(new ContinueOnEvent<>()), otherwise the propagation will stop when the first satellite leaves the FOV;
      • SignSemantic.FEASIBLE_MEASUREMENT_WHEN_NEGATIVE: see the Javadoc of the g(...) method in DoubleDihedralFieldOfView for more explanation on this.
  • Then, out of the TLEs loop, call final SortedSet<ObservedMeasurement<?>> measurements = generator.generate(t0, t1); between your chosen dates t0 and t1 to get the list of generated measurements

One flaw of this code is that you can set up only one FOV direction for all measurements.
You could loop on your different directions and do the generation for each camera direction.
But the best would be to create a new GroundFieldOfViewDetector with a TimeSpanMap of FieldOfView representing the different directions of your camera with respect to time.
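The TimeSpanMap idea can be sketched with a tiny stand-in class. This is plain Python with no Orekit; SimpleTimeSpanMap is a hypothetical illustration, not the Orekit class, but it imitates the behavior of TimeSpanMap’s addValidAfter: a piecewise-constant value of time, here standing in for successive camera directions.

```python
import bisect

class SimpleTimeSpanMap:
    """Minimal stand-in for Orekit's TimeSpanMap: a piecewise-constant
    value of time, defined by sorted transition times (seconds here,
    where Orekit would use AbsoluteDate)."""

    def __init__(self, initial_value):
        self.transitions = []           # sorted transition times
        self.values = [initial_value]   # len(values) == len(transitions) + 1

    def add_valid_after(self, value, t):
        # The new value applies from time t (inclusive) onwards.
        i = bisect.bisect_right(self.transitions, t)
        self.transitions.insert(i, t)
        self.values.insert(i + 1, value)

    def get(self, t):
        return self.values[bisect.bisect_right(self.transitions, t)]

# Camera pointing south until t = 3600 s, then north.
pointing = SimpleTimeSpanMap("AZ=180,EL=45")
pointing.add_valid_after("AZ=0,EL=45", 3600.0)
print(pointing.get(100.0))   # AZ=180,EL=45
print(pointing.get(5000.0))  # AZ=0,EL=45
```

A custom detector could query such a map with the current propagation date and build its g function from the FieldOfView valid at that date.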

I wrote this quite quickly so I’m sorry if it’s not super clear.

Hope this helps,


Thanks for the quick answer! I guess there is not much difference in coding this in Python other than syntax?

I think so but I’m no Python expert.

Got pretty much all the way but got stuck near the end. I don’t really understand how to build the EventDetector. So far I have:

  • A TLE for testing (a random Starlink):
tle_line1 = "1 53700U 22107A   22262.25002315 -.00003400  00000-0 -19788-4 0  9995"
tle_line2 = "2 53700  53.2141 203.0545 0001141 114.0473 238.9866 15.73231113  2358"
mytle = TLE(tle_line1,tle_line2)
  • Topocentric frame and ground station declared as:
ITRF = FramesFactory.getITRF(IERSConventions.IERS_2010, True)
earth = OneAxisEllipsoid(Constants.WGS84_EARTH_EQUATORIAL_RADIUS,
                         Constants.WGS84_EARTH_FLATTENING, ITRF)
longitude = radians(22.61861)
latitude  = radians(37.97194)
altitude  = 930.0
station = GeodeticPoint(latitude, longitude, altitude)
stationFrame = TopocentricFrame(earth, station, "Kryoneri")
groundStation = GroundStation(stationFrame)
  • Propagator, generator, ObservableSatellite and AngularAzElBuilder:
propagator = TLEPropagator.selectExtrapolator(mytle)
generator = Generator()
sat = generator.addPropagator(propagator)
azElBuilder = AngularAzElBuilder(None, groundStation, [0.,0.], [1.,1.], sat)

Now I can’t really figure out how to define the DoubleDihedralFieldOfView. Let’s say I have a camera with X degrees width and Y degrees height, with its center pointing at 180 degrees AZ and 45 degrees EL. I can’t really figure out how to transform these coordinates and numbers from the horizontal reference system into a Vector3D.

I suspect the rest of the code will look like this:

fov = DoubleDihedralFieldOfView(/*need some help here*/)
fovGroundDetector = GroundFieldOfViewDetector(stationFrame, fov)
scheduler = EventBasedScheduler(azElBuilder, FixedStepSelector(30.0, TimeScalesFactory.getUTC()),
                                generator.getPropagator(sat), fovGroundDetector,
                                SignSemantic.FEASIBLE_MEASUREMENT_WHEN_NEGATIVE)

Additionally, does this method detect only visible satellites, i.e. not in the Earth’s umbra, or do I need to implement another method to detect whether a satellite is in umbra? Moreover, is there a simple way to detect only after civil dusk and before civil dawn, e.g. to check whether, at the detector’s location, the Sun is more than 6 degrees below the horizon?

Hi @jlipinski,

A TopocentricFrame is defined with:

  • X axis in the local horizontal plane (normal to zenith direction) and following the local parallel towards East;
  • Y axis in the horizontal plane (normal to zenith direction) and following the local meridian towards North;
  • Z axis towards Zenith direction.

By convention azimuth is counted clockwise from the North direction, so positive towards the East (see TopocentricFrame#getAzimuth).

Given the arguments of the constructor of DoubleDihedralFieldOfView:

  • center vector: in your case (AZ, EL) = (180°, 45°) should give you center = [0, -√2/2, √2/2];
  • axis1 and axis2: these depend on how your sensor is rotated around the direction of the FOV center.
    If you want one of the sides to be parallel to the local horizontal, you can choose axis1 = [1, 0, 0];
    axis2 will then be perpendicular to the two previous vectors: axis2 = [0, √2/2, √2/2];
  • halfAperture1, halfAperture2: half aperture (in radians) of your FOV along each axis
  • margin can be set to 0.
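As a quick numerical check of the center vector above, here is a plain-Python sketch (no Orekit; the function name azel_to_topocentric is mine) of the (AZ, EL) → Cartesian conversion using the topocentric axes and azimuth convention just described:

```python
import math

def azel_to_topocentric(az_deg, el_deg):
    """Unit vector in the topocentric frame (X east, Y north, Z zenith)
    for an azimuth counted clockwise from north and an elevation above
    the horizontal plane."""
    az = math.radians(az_deg)
    el = math.radians(el_deg)
    return (math.cos(el) * math.sin(az),   # East component
            math.cos(el) * math.cos(az),   # North component
            math.sin(el))                  # Zenith component

center = azel_to_topocentric(180.0, 45.0)
print(center)  # ~(0, -0.7071, 0.7071), i.e. [0, -√2/2, √2/2]
```

The same helper gives any other pointing direction you may want to test, e.g. azel_to_topocentric(0.0, 45.0) for a north-facing sensor.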

You have to add another detector for this.

Use the GroundAtNightDetector with CIVIL_DAWN_DUSK_ELEVATION (you can ignore atmospheric refraction to begin with).
Its g function is positive when the ground station is at night:

CelestialBody sun = CelestialBodyFactory.getSun();
GroundAtNightDetector nightDetector = new GroundAtNightDetector(stationFrame, sun, GroundAtNightDetector.CIVIL_DAWN_DUSK_ELEVATION, null);

Then you want to combine both detectors, using a BooleanDetector.
The andCombine g function is positive if and only if the g functions of all detectors are positive.
However, the GroundFieldOfViewDetector g function is negative when the object is in the FOV, so you’ll need to negate it.
In the end, you will have:

BooleanDetector fovAtNightDetector = BooleanDetector.andCombine(BooleanDetector.notCombine(fovGroundDetector), nightDetector);
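As a sanity check of the combination logic, here is a plain-Python toy (no Orekit; the g functions are made up) mimicking the documented sign semantics: andCombine is positive only when all g functions are positive (a min over them), and notCombine flips the sign:

```python
# Toy g functions standing in for Orekit event detectors.
# g < 0 inside the FOV (GroundFieldOfViewDetector convention),
# g > 0 while the station is at night (GroundAtNightDetector convention).
def g_fov(t):
    return (t - 100.0) * (t - 160.0) / 1000.0   # negative for 100 s < t < 160 s

def g_night(t):
    return 1.0 if t < 300.0 else -1.0           # "night" before t = 300 s

def g_not(g):
    # notCombine: flips the sign of a g function.
    return lambda t: -g(t)

def g_and(*gs):
    # andCombine: positive iff every input g is positive.
    return lambda t: min(g(t) for g in gs)

g_combined = g_and(g_not(g_fov), g_night)

# Positive exactly when the object is in the FOV *and* it is night:
assert g_combined(130.0) > 0   # in FOV, at night
assert g_combined(50.0) < 0    # outside FOV
assert g_combined(400.0) < 0   # daytime
```

The combined detector thus changes sign exactly at FOV entries/exits that happen at night, which is what the scheduler needs.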

There is probably a more elegant way to do this using an EventEnablingPredicateFilter but I’m not sure how to do it right now, I’ll think about it when I have more time.

Hope this helps,

So I clearly did something wrong. With some help from Stellarium I found a satellite that should be visible in my FoV from my station location, yet I receive no data from the generator. Below you can find snippets from my code; any help will be much appreciated.

  • Setting up the TLE (UME 2 (ISS-B) satellite; according to this TLE and Stellarium it should be visible from Kryoneri at around 18:04 UTC, 2022.09.19, for almost a minute, with the detector set up as shown a few cells below):
tle_line1 = "1 10674U 78018A   22261.90676743 -.00000042  00000-0  12781-4 0  9997"
tle_line2 = "2 10674  69.3613 128.1060 0161631 151.4638 222.0348 13.43632735186960"
mytle = TLE(tle_line1,tle_line2)
  • Setting up station:
ITRF = FramesFactory.getITRF(IERSConventions.IERS_2010, True)
earth = OneAxisEllipsoid(Constants.WGS84_EARTH_EQUATORIAL_RADIUS,
                         Constants.WGS84_EARTH_FLATTENING, ITRF)
sun = CelestialBodyFactory.getSun()
longitude = radians(22.61861)
latitude  = radians(37.97194)
altitude  = 930.0
station = GeodeticPoint(latitude, longitude, altitude)
stationFrame = TopocentricFrame(earth, station, "Kryoneri")
groundStation = GroundStation(stationFrame)
  • Setting up Generator, propagator and measurement builder:
propagator = TLEPropagator.selectExtrapolator(mytle)
generator = Generator()
sat = generator.addPropagator(propagator)
AzElBuilder = AngularAzElBuilder(None, groundStation, [0.,0.], [1.,1.], sat)
  • Function to convert azimuth (measured clockwise from the Y axis) and elevation (measured from the reference plane towards zenith) to Cartesian coordinates:
def sph2car(phi, theta, r=1.):
    phi = FastMath.toRadians(90. - phi)
    theta = FastMath.toRadians(theta)
    return [
        r * FastMath.cos(phi) * FastMath.cos(theta),
        r * FastMath.sin(phi) * FastMath.cos(theta),
        r * FastMath.sin(theta)
    ]
  • Setting up FoV and FoV ground detector:
phi = 180.
theta = 45.

height = 10.
width = 15.

x,y,z = sph2car(phi,theta)
center = Vector3D([x,y,z])
axis1 = Vector3D([1.0,0.0,0.0])
axis2 = Vector3D([0.0,FastMath.sqrt(2.0)/2.0,FastMath.sqrt(2.0)/2.0])
ha1 = FastMath.toRadians(width)/2.
ha2 = FastMath.toRadians(height)/2.

fov = DoubleDihedralFieldOfView(center, axis1, ha1, axis2, ha2, 0.0)
fovGroundDetector = GroundFieldOfViewDetector(stationFrame, fov).withHandler(ContinueOnEvent())
  • Setting up night detector:
nightDetector = GroundAtNightDetector(
    stationFrame, sun, GroundAtNightDetector.CIVIL_DAWN_DUSK_ELEVATION, None
)
  • Setting up eclipse detector:
eclipseDetector = EclipseDetector(sun, Constants.SUN_RADIUS, earth).withPenumbra().withHandler(ContinueOnEvent())
  • Combining detectors with a BooleanDetector:
fovAtNightDetector = BooleanDetector.andCombine([
    BooleanDetector.notCombine(fovGroundDetector), nightDetector, eclipseDetector
])
  • Setting up scheduler:
scheduler = EventBasedScheduler(AzElBuilder, FixedStepSelector(10.0, TimeScalesFactory.getUTC()),
                                generator.getPropagator(sat), fovAtNightDetector,
                                SignSemantic.FEASIBLE_MEASUREMENT_WHEN_NEGATIVE)
generator.addScheduler(scheduler)
  • Getting the list of measurements:
t0 = AbsoluteDate(2022,9,19,18,0,0.0, TimeScalesFactory.getUTC())
t1 = AbsoluteDate(2022,9,19,18,10,0.0, TimeScalesFactory.getUTC())
data = generator.generate(t0,t1)

I don’t think this works properly. It gives me a SortedSet of size 61, while this satellite should be visible for a minute or less, so I think the size should be closer to 6 with a 10 s step size. Additionally, I’m not sure how to parse these ObservedMeasurements to get ALT and AZ (or X, Y, Z in my stationFrame). I know that I can derive the date of detection using the getDate() method and I can get the NORAD ID from the TLE.

Below is a screenshot from Stellarium showing that this satellite (UME 2) should be detected:

I think the fov axes are wrong.
Try something like (using Java syntax):

Vector3D axis2 = Vector3D.crossProduct(Vector3D.PLUS_K, center).normalize();
Vector3D axis1 = Vector3D.crossProduct(axis2, center).normalize();
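For a quick check with plain Python (no Orekit, using a hand-rolled cross product), the two cross products above reproduce, up to ordering and sign (which do not matter for symmetric dihedral apertures), the axes computed by hand earlier in the thread:

```python
import math

def cross(a, b):
    # Standard 3D cross product.
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

s = math.sqrt(2.0) / 2.0
center = (0.0, -s, s)   # (AZ, EL) = (180 deg, 45 deg)

axis2 = normalize(cross((0.0, 0.0, 1.0), center))   # PLUS_K x center
axis1 = normalize(cross(axis2, center))

print(axis2)  # (1, 0, 0): the hand-computed axis1
print(axis1)  # ~(0, -s, -s): the hand-computed axis2 with the opposite sign
```

The sign flip and the axis1/axis2 swap are harmless here because each half aperture is symmetric about the FOV center.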

Unfortunately this didn’t help. I still get measurements for every time step, as shown below:

for measurement in data:
    castedMeasurement = AngularAzEl.cast_(measurement)

Hi @jlipinski,

@luc’s code actually works and gives the same values I gave you. But it’s better because it computes them automatically.

There are two issues here:

  1. The maximum check interval of your GroundFieldOfViewDetector is too big.
    It is not set manually, so it uses the default AbstractDetector.DEFAULT_MAXCHECK of 600 s (10 min). Since you’re looking for an entry/exit event that lasts around a minute, the whole pass can fit between two checks and the event is missed.
    So I added .withMaxCheck(1.) to the first detector (checking every second);

  2. We made the g function of the combined detector positive when the measurement is feasible, but afterwards we tell the generator to check for FEASIBLE_MEASUREMENT_WHEN_NEGATIVE.
    So before, you were actually capturing the inverse event compared to what you wanted :wink: Switching to SignSemantic.FEASIBLE_MEASUREMENT_WHEN_POSITIVE fixes it.
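Point 1 can be illustrated with a toy g function (plain Python, no Orekit): a pass lasting about 60 s produces no sign change at all when sampled with the 600 s default max check, so the event detector never brackets it:

```python
# Toy g function for a ~60 s pass: negative while the satellite is in
# the FOV (entry at t = 270.5 s, exit at t = 330.5 s), positive outside.
def g(t):
    return (t - 270.5) * (t - 330.5)

def sign_changes(step, t0=0.0, t1=600.0):
    """Count sign changes of g when sampled every `step` seconds,
    which is how an event detector brackets its roots."""
    changes = 0
    t = t0
    prev = g(t)
    while t < t1:
        t += step
        cur = g(t)
        if prev * cur < 0:
            changes += 1
        prev = cur
    return changes

print(sign_changes(600.0))  # 0: with the 600 s default max check the pass is invisible
print(sign_changes(1.0))    # 2: entry and exit are both bracketed
```

This is why the max check interval must be shorter than the shortest event you want to capture.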

With this I found the following events (date: az / el)

2022-09-19T18:04:30.000Z: 173.692 / 47.313
2022-09-19T18:04:40.000Z: 172.517 / 44.805
2022-09-19T18:04:50.000Z: 171.510 / 42.429
2022-09-19T18:05:00.000Z: 170.638 / 40.180

Looks better, no?

For the previous prints I used the following Java code:

for (final ObservedMeasurement<?> meas : data) {
    final AngularAzEl azel = (AngularAzEl) meas; // Java casting, equivalent to "AngularAzEl.cast_(meas)" in Python
    final double az = azel.getObservedValue()[0];
    final double el = azel.getObservedValue()[1];
    final AbsoluteDate date = azel.getDate();
    System.out.format(Locale.US, "%s: %.3f / %.3f%n", date, FastMath.toDegrees(az), FastMath.toDegrees(el));
}

AngularAzEl does not contain range or altitude measurements.

Hope this helps,

Thank you very much @luc, @MaximeJ. Now it works as expected!

I have one more question though. I have two detectors located in different locations. Let’s say (to continue the example from this thread) one is in Kryoneri, the other is somewhere in Africa (same longitude, latitude around 20 degrees). They both look in each other’s direction (the Kryoneri one at 180 AZ, 45 EL, the African one at 0 AZ, 45 EL). Is it OK to just add to the existing BooleanDetector another GroundFieldOfViewDetector and GroundAtNightDetector set up for the African detector? I’m OK with getting AZ and EL coordinates from only one of them; I just need to check whether a satellite is visible from both places.

Hi @jlipinski,

You’re very welcome. We’re happy to help!

I think so, yes. The combined g function should be positive only if both sensors are at night and if the satellite is not in eclipse and is visible from both sensors.
