Determining range using the parallax angle

Hi there,
I'm a pretty new Orekit user. First of all, thanks to the Orekit team for this open source library.
Next, I have a question about it. I plan to use Orekit for my project on orbit determination.
Range values are calculated via parallax angles from two-site optical observations. I used the method described in Earl's paper, which involves some manifold and conversion calculations: [1504.00965] Determining the Range of an Artificial Satellite Using its Observed Trigonometric Parallax

I wonder if Orekit already has built-in method(s) for calculating this? Searching the forum didn't give me anything about it…

Hi @ggokay, welcome.

We don't have this type of measurement yet, but it would be easy to add. If you want to give it a try (and perhaps contribute it to the library!), then you can look at how other measurements are implemented. The idea is to create a class that extends the AbstractMeasurement base class and, in particular, provide an implementation of the theoreticalEvaluation method.

Contributions welcome!

One further comment on this measurement.
As the input data are a pair of angular measurements from two known stations, these data could be fed to Orekit as two instances of AngularRaDec or AngularAzEl measurements and the library would do the math. It would even work with measurements that are not simultaneous.
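For illustration, a minimal sketch (untested; topo1, topo2, eme2000, t1, t2, the ra/dec values and the sigma/weight numbers are all placeholders, and the constructor signature is the one of recent Orekit versions, so check yours) of feeding the two angle pairs as AngularRaDec measurements from the Python wrapper:

from org.orekit.estimation.measurements import AngularRaDec, GroundStation, ObservableSatellite

sat = ObservableSatellite(0)          # index of the estimated satellite
station1 = GroundStation(topo1)       # topo1, topo2: TopocentricFrame instances
station2 = GroundStation(topo2)

# angular values in radians; sigmas and base weights per (RA, Dec) component
# (depending on the wrapper version, the lists may need JArray('double') wrapping)
meas1 = AngularRaDec(station1, eme2000, t1, [ra1, dec1], [1.0e-5, 1.0e-5], [1.0, 1.0], sat)
meas2 = AngularRaDec(station2, eme2000, t2, [ra2, dec2], [1.0e-5, 1.0e-5], [1.0, 1.0], sat)

# both can then be added to an estimator such as BatchLSEstimator;
# the epochs t1 and t2 need not be simultaneous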

Thank you @luc, I'll work on this ASAP.
I think simultaneous measurements are only possible in a hypothetical sense, so some inter- or extrapolation in the time domain must be involved.
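For what it's worth, this is the kind of synchronization I have in mind; a minimal sketch with a placeholder function of my own (plain linear interpolation, RA wrap handled via the shortest angular path):

import numpy as np

def interp_radec(t, t0, ra0, dec0, t1, ra1, dec1):
    # linearly interpolate an (ra, dec) pair (radians) from epochs t0 and t1
    # (seconds past a common reference) to epoch t
    f = (t - t0) / (t1 - t0)
    dra = (ra1 - ra0 + np.pi) % (2.0 * np.pi) - np.pi  # shortest path across the RA wrap
    return ra0 + f * dra, dec0 + f * (dec1 - dec0)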

What would be the primary use case for this approach?

Apologies, as I think I might be missing something.

A new combined “parallax observation” doesn't sound like something one would use for precision OD along with other observations, because with sufficient observations we can already use angle data, with parallax correction, accounting for propagation between observations with a very accurate orbital model of our choosing, which self-consistently evaluates the parallax for the estimated state vector at the time of observation. Perhaps this is the second point made by @luc?

It also doesn’t quite sound like initial OD, because you don’t have enough data.

If observations were “at the same time”, then it could be a way to find a position vector (not a full state vector), which might be useful as a stand-alone function? With non-zero time differences, the error introduced might reasonably be bounded by constraining the unknown velocity vector (perhaps specifying the bound in the method API) and evaluating a bounding collection of ranges?

I agree, using the two raw angular measurements as regular angular measurements seems better to me. It doesn’t need approximate modelling of the triangle and works out of the box.

Hello @PeaDubYa, and sorry for the delay :slight_smile:
I agree with you that it is a poor way to obtain precise parameters. But imagine someone who has angle-only observations, from two GeodeticPoints (at least in this case). He would surely obtain more precise orbit parameters by adding something that contains distance values. Observations from at least two geodetic points allow him to use methods other than angle-only ones.
Did I misunderstand, or do “parallax correction” and “evaluation” implicitly include a range calculation inside?

Not in this case, because the distance information here is derived entirely from the angular measurements; it does not add more information. The only distance that allows converting the original angle pairs into a range comes from the definition of the two geodetic points, which is information that is also used by the AzEl and RaDec measurements.

Hello and happy new year to all :slight_smile:
I'm working on my problem and am stuck at one point. First, I would like to check my manifold calculations using Orekit's built-in Vector3D and TopocentricFrame classes. For this, I tried to reduce the problem to “planar geometry”, i.e. a planar triangle,
where the satellite is at point A, observer1 sits at B and observer2 sits at C.
a: linear distance between B and C (of course not geodetic)
b: satellite range to observer2
c: satellite range to observer1
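(Once a and the triangle angles A, B, C are known, the ranges follow from the planar law of sines; a minimal sketch with a helper of my own, all angles in radians:)

import numpy as np

def parallax_ranges(a, A, B, C):
    # law of sines: a / sin(A) = b / sin(B) = c / sin(C)
    # A is the parallax angle at the satellite, B and C the angles at the observers
    b = a * np.sin(B) / np.sin(A)  # range satellite-observer2
    c = a * np.sin(C) / np.sin(A)  # range satellite-observer1
    return b, c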

Hereafter I'll use numbers to explain myself more clearly
(numbers obtained from Stellarium).

from org.orekit.frames import FramesFactory, TopocentricFrame
from org.orekit.bodies import OneAxisEllipsoid, GeodeticPoint
from org.orekit.time import AbsoluteDate, TimeScalesFactory
from org.orekit.utils import IERSConventions, Constants
from org.hipparchus.geometry.euclidean.threed import Vector3D
import numpy as np

itrf = FramesFactory.getITRF(IERSConventions.IERS_2010, True)
body = OneAxisEllipsoid(Constants.WGS84_EARTH_EQUATORIAL_RADIUS, Constants.WGS84_EARTH_FLATTENING, itrf)
lat1, lon1, alt1 = float(np.radians(39.919869)), float(np.radians(32.854272)), float(874)
topo1 = TopocentricFrame(body, GeodeticPoint(lat1, lon1, alt1), "observer1")
lat2, lon2, alt2 = float(np.radians(36.908119)), float(np.radians(30.695561)), float(61)
topo2 = TopocentricFrame(body, GeodeticPoint(lat2, lon2, alt2), "observer2")
date1 = AbsoluteDate(2023, 1, 3, 7, 0, 0.0, TimeScalesFactory.getUTC())

pos1 = topo1.getPVCoordinates(date1, topo2).position  # topo1 position at the observation moment, seen from topo2
pos2 = topo2.getPVCoordinates(date1, topo1).position  # same but for topo2

a = np.sqrt(pos1.x**2 + pos1.y**2 + pos1.z**2)  # linear distance between B and C

# observed vectors, vector_fromTo
# angles are topocentric right ascension and declination values at observer #1 or #2
v_BA = Vector3D(282.8514167 * np.pi / 180, -5.927083 * np.pi / 180)
v_CA = Vector3D(283.2695416667 * np.pi / 180, -5.543694444 * np.pi / 180)

v_BC = pos2.normalize()
v_CB = pos1.normalize()

A = Vector3D.angle(v_BA, v_CA) * 180 / np.pi  # parallax angle
B = Vector3D.angle(v_BA, v_BC) * 180 / np.pi
C = Vector3D.angle(v_CA, v_CB) * 180 / np.pi

But I never obtain A+B+C equal to 180 :slight_smile: Some trials give greater values, some lesser.
For this example, it's 181.01929…

Perhaps the reason is very simple, but I cannot find it…

pos1 and pos2 are not expressed in the same frame. Not only is the origin of each frame different, the orientation is also different, because topocentric frames are aligned with respect to the local zenith, local East and local North.

I guess you should put everything in ITRF before combining vectors.

Thank you very much for your response @luc.

Do you mean something like this?

trans1 = topo1.getStaticTransformTo(itrf, date1)
new_v_BA = trans1.transformVector(v_BA)
trans2 = topo2.getStaticTransformTo(itrf, date1)
new_v_CA = trans2.transformVector(v_CA)

new_A = Vector3D.angle(new_v_BA, new_v_CA) * 180 / np.pi

new_pos1 = topo1.getPVCoordinates(date1, itrf).position
new_pos2 = topo2.getPVCoordinates(date1, itrf).position

new_v_BC = pos2.subtract(pos1).normalize()
new_v_CB = pos1.subtract(pos2).normalize()  # or simply -1 * the above
new_B = Vector3D.angle(new_v_BA, new_v_BC) * 180 / np.pi
new_C = Vector3D.angle(new_v_CA, new_v_CB) * 180 / np.pi

Perhaps

new_v_BC = new_pos2.subtract(new_pos1).normalize()

Yes, sure, new_pos#; I wrote it from another PC, sorry :slight_smile:
But it's still different from 180 :frowning: This time 182.7272…

Sorry, I misread some parts of your initial message.
If your angular coordinates are right ascension and declination, they are probably in a frame oriented inertially but centered at the observing point, so this is neither an inertial frame nor a topocentric frame.
You may try to build the corresponding frame (an interesting but not trivial task), or you may try to convert everything to inertial, so everything becomes consistent with v_BA and v_CA.

Now I think I'm very close to the solution, thank you @luc :slight_smile:

v1 = Vector3D(282.8514166666667 * np.pi / 180, -5.927083333333334 * np.pi / 180)  # observed from observer1, satellite's observed ra-dec values
v2 = Vector3D(283.26954166666667 * np.pi / 180, -5.543694444444444 * np.pi / 180)  # interpolated values to date1
date1 = AbsoluteDate(2023, 1, 3, 7, 0, 0.0, TimeScalesFactory.getUTC())  # time measured by observer1

inertialFrame = FramesFactory.getEME2000()
itrf = FramesFactory.getITRF(IERSConventions.IERS_2010, True)
body = OneAxisEllipsoid(Constants.WGS84_EARTH_EQUATORIAL_RADIUS, Constants.WGS84_EARTH_FLATTENING, itrf)
lat1, lon1, alt1 = float(np.radians(39.919869444444444)), float(np.radians(32.85427222222222)), float(874)
topo1 = TopocentricFrame(body, GeodeticPoint(lat1, lon1, alt1), "observer1")
lat2, lon2, alt2 = float(np.radians(36.908119444444445)), float(np.radians(30.69556111111111)), float(61)
topo2 = TopocentricFrame(body, GeodeticPoint(lat2, lon2, alt2), "observer2")

pos1 = topo1.getPVCoordinates(date1, inertialFrame).position
pos2 = topo2.getPVCoordinates(date1, inertialFrame).position
dist = pos1.subtract(pos2).getNorm()  # linear distance between observer1 and observer2
v3a = topo2.getPVCoordinates(date1, topo1).position.normalize()  # topo1 coordinates of observer2
v3b = topo1.getPVCoordinates(date1, topo2).position.normalize()  # topo2 coordinates of observer1

v3a1 = topo1.getStaticTransformTo(inertialFrame, date1).transformVector(v3a)  # observer2's direction as seen by observer1, transformed to inertialFrame
v3b1 = topo2.getStaticTransformTo(inertialFrame, date1).transformVector(v3b)  # same for observer1

a = Vector3D.angle(v1, v2) * 180 / np.pi
b = Vector3D.angle(v1, v3a1) * 180 / np.pi
c = Vector3D.angle(v2, v3b1) * 180 / np.pi

a + b + c = 180.0000005702155
residual: 0.0020527758 arcsec
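(The quoted residual is simply the angle-sum excess converted from degrees to arcseconds:)

excess_deg = (a + b + c) - 180.0   # 5.702155e-7 deg
residual = excess_deg * 3600.0     # 0.0020527758 arcsec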

The observed (ra, dec) values were obtained from Stellarium. So, what do you think about the difference?

This seems fine to me.
I don't know what Stellarium uses as an Earth frame, or which Earth Orientation Parameters they use, so I would even have guessed you would get larger differences.

The final difference you got is on the order of 1e-7 degrees, which is more than enough, I believe.

Ok, it's finally about to be completed :slight_smile:
But there is one point I can't figure out. I obtain (ra, dec) measurements of the same satellite from two different sites, synchronized by linear interpolation in time. I also have range measurements from radiometric observations.

But in all data sets, the radiometric range values differ from the range values obtained by optical observations. Of course this is acceptable, but I think there is something systematic, and I don't know why. In all optical observations, the satellite seems to move away over time. Maybe the zenith angle decreases with time… but why?! The optical range values have a positive slope, and the slope is positive at different times over the year. If the slope were nearly zero in July, I would conclude the reason was aberration, but it's still >0. I checked the nightly temperature and pressure variation of the atmosphere for refraction variation, but nope.

All optical observations were registered using Gaia DR3, with the Gaia-CRF3 frame taken into account. Because the satellite is GEO, its altitude is nearly the same for all observations, and the stars used for coordinate registration are affected by the same effects, no aberration, refraction, etc. corrections were applied.

In the attached figure, blue dots are radiometric range values, red dots are optically determined range values from near the “blue dot station”, and green dots represent range values from another station.

[Figure_1: range values vs. time for the three data sets]

The effect(s) seem time dependent, and I can't find anything to explain that. What do you think about it?

Hey there,

I'm joining the conversation a bit late, so sorry if I'm missing something.

Does the paper and/or your code include the delay caused by signal travel?

I'm not sure I understand, but you say that you do not apply any corrections to the optical data? If they come from astrometry, aberration does need to be added, as the catalogued stars don't have it. That being said, maybe your data provider has already done it.

Best,
Romain.

Hi Romain and thanks for your time :slight_smile:

You are right, no corrections have been applied to the data. But the light travel time, on the order of 0.1 s, must finally be taken into account. Its only effect, though, is an almost linear shift along the time (x) axis (the variation range of the satellite's range value ( :slight_smile: ) is nearly 20 km).
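A quick back-of-the-envelope check (my own rough numbers, order of magnitude only): at GEO the light time is about 0.13 s, and spreading the ~20 km range variation over a night, ignoring it shifts the range itself by well under a metre:

c_light = 299792458.0                  # m/s
geo_range = 3.8e7                      # assumed GEO slant range, m
light_time = geo_range / c_light       # ~0.127 s
range_rate = 20.0e3 / (8 * 3600.0)     # ~20 km over ~8 h -> ~0.7 m/s (assumed)
range_error = range_rate * light_time  # ~0.09 m if light time is ignored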

Could aberration cause a systematic error like this? Also, if the catalogue coordinates of the stars do not include aberration, and the satellite coordinates are determined from them, doesn't that also make the satellite coordinates aberration-free?
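For scale (again my own rough numbers): annual aberration can reach about 20.5 arcsec, which at GEO distance is a kilometre-level apparent transverse offset, while the differential (diurnal) part between two sites is only about 0.3 arcsec, i.e. tens of metres:

import numpy as np

geo_range = 3.8e7                                # assumed GEO slant range, m
annual = np.radians(20.5 / 3600.0) * geo_range   # ~3.8 km common apparent offset
diurnal = np.radians(0.32 / 3600.0) * geo_range  # ~60 m site-to-site differential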

Another question: does the “nightly aberration variation” compensate for this seemingly time-dependent residual?

Regards,
gökhan