Finding Sat roll angle to topocentric frame

Hello!

I just want to say firstly, I know that no one is making you help us here, so I want to thank you up front for your time!!

What I want to do is find the roll angle needed for a satellite to see a TopocentricFrame. I’m having trouble figuring out how to compute the roll angle from the satellite position to a TopocentricFrame expressed in the sat’s orbital frame. To determine the desired angle I have this (borrowed from the forums here):

// attitude provider aligned with the VVLH local orbital frame
LofOffset VVLH = new LofOffset(s.getFrame(), LOFType.VVLH);
SpacecraftState converted = new SpacecraftState(s.getOrbit(), VVLH.getAttitude(s.getOrbit(), s.getDate(), s.getFrame()));

// target position expressed in the spacecraft frame
TimeStampedPVCoordinates pv2 = converted.toTransform().transformPVCoordinates(farmFrame.getPVCoordinates(s.getDate(), s.getFrame()));

// angle between the line of sight and the +Z axis
double xAngleFromSpaceCraft = Vector3D.angle(pv2.getPosition(), Vector3D.PLUS_K);

From what I understand, this should give us the angle between the direction to the Topocentric frame and the z axis of the satellite’s orbital frame, which should be pointing towards the Earth.

Right now, I’m triggering this code when the Topocentric frame triggers a field of view detector with a nadir attitude. The angle I get appears to be off. Furthermore, when I enlarge the field of view detector (by increasing the half-aperture angle), the angle I get from the code above grows too. Is there a way to do what I am trying to do?

How should I be thinking about this problem?

Thanks again for your support on everything we space junkies try to do here.
Orekit 10.3

Thank you for your time!!!

Evan

Hi Evan,

I think the answer depends on other constraints on the satellite attitude or how you define the roll angle. For example, the boresight of an instrument could be made to point in any direction using only pitch and yaw.

One option could be to create a Rotation that rotates from the LOF frame to the correct attitude. Then the Rotation class has methods to get the angles according to different conventions. E.g.

new Rotation(boresightInBodyFrame, directionToGroundSite).getAngles(...)

Note that this example leaves the clock angle around the boresight unspecified. It can be defined using the four vector constructor of Rotation.

Perhaps another convention would be to project the direction to the ground site to the radial/cross-track plane and then take the angle w.r.t. the nadir vector.
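That projection convention can be sketched in plain Python (a minimal sketch with no Orekit dependency; the frame axes and the 30° test vector are hypothetical):

```python
import math

def roll_from_projection(los_lof):
    """Project the line of sight (expressed in a VVLH-style local orbital
    frame where +Z is nadir and +Y is cross-track) onto the radial/cross-track
    (Y-Z) plane, and return the signed angle w.r.t. the nadir (+Z) axis."""
    y, z = los_lof[1], los_lof[2]  # drop the along-track component
    return math.atan2(y, z)       # signed angle from +Z toward +Y

# line of sight 30 deg off nadir, purely cross-track
los = (0.0, math.sin(math.radians(30.0)), math.cos(math.radians(30.0)))
print(round(math.degrees(roll_from_projection(los)), 6))  # → 30.0
```

With this convention, any along-track component of the line of sight is absorbed by pitch and does not pollute the roll value.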

Best Regards,
Evan

I think I want to do the same thing as you: compute the angle between the satellite’s Z axis and the direction to the center of the ground station, i.e. the satellite’s side-looking (roll) angle. In my case this angle is constrained to ±45°, but I think the angle can be output regardless of whether that constraint is satisfied. Have you made any progress on your side?

Hello,
sorry for bringing up an old post, but I have seen that no solution has been provided.

I am currently in need of evaluating the roll, pitch and yaw angles needed to point at a target on ground. I have defined the attitude as LVLH_CCSDS and the boresight in body axes is +Z.

Could you provide a working snippet that evaluates the angles required to point at a GeodeticPoint with the satellite boresight?

I would suggest in this case to use the AlignedAndConstrained attitude mode, with a GroundPointTarget for the primary target. This would directly provide an attitude pointing to the desired location on ground, and state.getAttitude().getRotation() would get the rotation r1 from inertial frame to the pointing attitude.
Then, you could call LVLH_CCSDS.rotationFromInertial(state.getPVCoordinates()) to get the rotation r2 from inertial frame to LVLH.
Then, by calling r = r1.applyTo(r2.revert()) you would get the rotation from LVLH to pointing attitude, and you could extract the angles from this rotation using r.getAngles(order, convention).
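In matrix terms, r = r1.applyTo(r2.revert()) is just R = R1·R2⁻¹. Here is a minimal pure-Python sketch of that composition (no Orekit; the toy matrices standing in for the inertial→attitude and inertial→LVLH rotations are hypothetical, and Hipparchus’s frame-transform conventions are glossed over):

```python
import math

def rot_z(a):
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def rot_x(a):
    c, s = math.cos(a), math.sin(a)
    return [[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def transpose(a):
    return [[a[j][i] for j in range(3)] for i in range(3)]

# r2: inertial -> LVLH (toy), r1: inertial -> pointing attitude,
# built here as LVLH followed by a pure 10 deg roll about X
r2 = rot_z(math.radians(40.0))
roll = rot_x(math.radians(10.0))
r1 = matmul(roll, r2)

# r = r1 * r2^-1 : rotation from LVLH to the pointing attitude
r = matmul(r1, transpose(r2))

# extract the roll angle back (r should be a pure X rotation)
recovered_roll = math.degrees(math.atan2(r[2][1], r[2][2]))
print(round(recovered_roll, 6))  # → 10.0
```

The common r2 factor cancels out, so whatever share of the attitude comes from the orbital frame itself does not show up in the extracted angles.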

Dear @luc,
thank you for your answer. In the meantime, I had thought of something similar to your approach, which seems to work reasonably well.

eci = FramesFactory.getEME2000()
itrf = FramesFactory.getITRF(IERSConventions.IERS_2010, True)
attitudeProvider = LofOffset(eci, LOFType.LVLH_CCSDS, RotationOrder.XYZ, 0.0, 0.0, 0.0)
numericalPropagator.setAttitudeProvider(attitudeProvider)
...
target_ground = GeodeticPoint(lat, lon, altitude)
target_frame = TopocentricFrame(earth, target_ground, "Station")
...

attitude = sat_state.getAttitude()
geoPointInRefFrame = target_frame.getPVCoordinates(sat_state.getDate(), eci).getPosition()
sat2GeoPoint = geoPointInRefFrame.subtract(sat_state.getPVCoordinates(eci).getPosition())
geoPointDirectionInSatFrame = attitude.getRotation().applyTo(sat2GeoPoint.normalize())

rotation_angles = Rotation(Vector3D.PLUS_K, geoPointDirectionInSatFrame).getAngles(
    RotationOrder.ZYX, RotationConvention.VECTOR_OPERATOR)
angles_deg = [degrees(angle) for angle in rotation_angles]
yaw_deg, pitch_deg, roll_deg = angles_deg

This seems to work: I obtain a roll angle very similar to my incidence angle (since I am analyzing a zero-Doppler condition), and small (around 1°) yaw and pitch angles.
The fact that the yaw and pitch angles are small but non-zero is worrying: either I am not correctly estimating the three rotation angles (perhaps because the attitude is defined in ECI and not ECEF?), or I am not exactly in the zero-Doppler condition when evaluating my SpacecraftState.

I am currently using the ElevationExtremumDetector to check when the elevation is maximum (and therefore the incidence angle minimum), which should correspond to the zero-Doppler condition: but if I compute dot(los_ecef, v_ecef), the value is around 0.02 and not 0.
In case this is not right, is there a detector to obtain the state at which the spacecraft and the geodetic point satisfy the zero-Doppler condition, dot(los_ecef, v_ecef) = 0?

I would expect to see both los_ecef perpendicular to v_ecef and only a roll rotation required to point at the target, without residuals in either.

Your solution creates the rotation from only two vectors: Vector3D.PLUS_K and geoPointDirectionInSatFrame. There are an infinite number of rotations that fulfill this: once you have a first rotation, you can combine it with any rotation around the satellite K axis and still point to the station. The two-vector constructor for rotations selects one rotation among this infinite set: the one with the smallest angle.

This is the reason why @evan.ward above suggested using the four-vector constructor of Rotation, but you have to specify yourself how the two other vectors are handled. This is in fact what is done internally by AlignedAndConstrained, and this is the purpose of its secondary target provider. You could for example say the primary alignment is to have Vector3D.PLUS_K pointing exactly towards your ground target, and the secondary constraint is to have Vector3D.PLUS_X pointing roughly towards PredefinedTarget.VELOCITY. There are several other predefined targets that could be useful.
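The ambiguity can be illustrated with plain vector algebra (a pure-Python sketch, no Orekit; the boresight and ground-site directions are hypothetical):

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def rotate(axis, angle, v):
    """Rodrigues' formula: rotate v by angle around a unit axis."""
    c, s = math.cos(angle), math.sin(angle)
    ax = normalize(axis)
    dot = sum(a * b for a, b in zip(ax, v))
    cross = (ax[1] * v[2] - ax[2] * v[1],
             ax[2] * v[0] - ax[0] * v[2],
             ax[0] * v[1] - ax[1] * v[0])
    return tuple(v[i] * c + cross[i] * s + ax[i] * dot * (1.0 - c)
                 for i in range(3))

u = (0.0, 0.0, 1.0)             # boresight, +K
v = normalize((1.0, 0.0, 1.0))  # direction to the ground site

# shortest-arc rotation: axis = u x v, angle = angle(u, v)
axis = (u[1] * v[2] - u[2] * v[1],
        u[2] * v[0] - u[0] * v[2],
        u[0] * v[1] - u[1] * v[0])
angle = math.acos(max(-1.0, min(1.0, sum(a * b for a, b in zip(u, v)))))

aligned = rotate(axis, angle, u)
# composing with ANY rotation about v still maps u onto v
also_aligned = rotate(v, 1.234, aligned)

print(all(abs(a - b) < 1e-12 for a, b in zip(aligned, v)))       # → True
print(all(abs(a - b) < 1e-12 for a, b in zip(also_aligned, v)))  # → True
```

The extra 1.234 rad rotation about the line of sight changes the extracted Euler angles but not the pointing, which is exactly why a secondary constraint is needed to make the angles well defined.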

There is no such detector, but it is easy to create one: you already know its g function, it would be dot(los_ecef, v_ecef).
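Such a g function is easy to prototype outside Orekit first. In this toy straight-line flyby (all numbers hypothetical), the dot product is negative while approaching, positive while receding, and crosses zero exactly at closest approach, which is the sign change an event detector needs:

```python
import math

def g(sat_pos, sat_vel, target_pos):
    """Zero-Doppler event function: normalized dot product between the
    velocity and the target-to-satellite line of sight."""
    los = tuple(p - t for p, t in zip(sat_pos, target_pos))
    nv = math.sqrt(sum(c * c for c in sat_vel))
    nl = math.sqrt(sum(c * c for c in los))
    return sum(v * l for v, l in zip(sat_vel, los)) / (nv * nl)

target = (0.0, 0.0, 0.0)
vel = (7500.0, 0.0, 0.0)  # m/s, along +X

def pos(t):
    # straight-line track 600 km from the target, closest approach at t = 0
    return (7500.0 * t, 600.0e3, 0.0)

print(g(pos(-10.0), vel, target) < 0.0)  # approaching → True
print(g(pos(+10.0), vel, target) > 0.0)  # receding → True
print(g(pos(0.0), vel, target))          # → 0.0 at closest approach
```

The propagator's root finder then brackets the sign change and refines the event date down to the detector threshold.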

Dear @luc,
thank you so much for your inputs! Now I am able to get much better results.

I have implemented the detector as follows:

class ZeroDopplerDetector(PythonAbstractDetector):

    def __init__(self,
                 max_check=AbstractDetector.DEFAULT_MAXCHECK,
                 threshold=AbstractDetector.DEFAULT_THRESHOLD,
                 max_iter=AbstractDetector.DEFAULT_MAX_ITER,
                 event_handler=ContinueOnEvent()):
        super().__init__(max_check, threshold, max_iter, event_handler)

    def g(self, spacecraftState):
        # line of sight and velocity, both in the Earth-fixed frame
        target_frame = TopocentricFrame(earth, target_ground, "GS")
        pv = spacecraftState.getPVCoordinates(itrf)
        sat_pos = pv.getPosition()
        sat_vel = pv.getVelocity()

        geo_pos = target_frame.getPVCoordinates(spacecraftState.getDate(), itrf).getPosition()
        geo_to_sat = sat_pos.subtract(geo_pos)
        # zero-Doppler condition: velocity perpendicular to the line of sight
        return Vector3D.dotProduct(sat_vel.normalize(), geo_to_sat.normalize())

and it is correctly converging to the state I need (which was around half a second different from the previous one).

I have also implemented the attitude as you have suggested:

target_ground = GeodeticPoint(lat, lon, alt)
target_frame = TopocentricFrame(earth, target_ground, "GS")
PV_target = earth.transform(target_ground)
GPT_target = GroundPointTarget(PV_target)

attitudeProvider = AlignedAndConstrained(Vector3D.PLUS_I, PredefinedTarget.VELOCITY, Vector3D.PLUS_K, GPT_target, sun, earth)

r1 = state.getAttitude().getRotation()
r2 = LOFType.LVLH_CCSDS.rotationFromInertial(state.getPVCoordinates())
r = r1.applyTo(r2.revert())
rotation_angles = r.getAngles(RotationOrder.ZYX, RotationConvention.VECTOR_OPERATOR)
angles_deg = [degrees(angle) for angle in rotation_angles]

and now I am getting the roll values I expect, and yaw/pitch angles are around 0.01°-0.02°.

Is there a reason why they are not actually 0? I have also tried inverting the primary and secondary targets and axes in the attitude definition, but I get worse results. Since the detector converges correctly (the dot product is around 1e-14), I was expecting these two angles to be null as well. Is my implementation still missing something?