Updates for High Fidelity Heliocentric Propagation

Ah ah, yes, if you forget to set up checkstyle from the beginning, that usually happens :wink:

Not sure. The first thing to do would be to check out a new branch from your develop branch and call it “issue-1681”. Then push this branch to your fork and see if SonarQube can give you a delta of the new code with respect to the run on the “develop” branch that you already have.
That is what we are interested in when looking at your merge request.
Then, I don’t know if you can change the source branch of your merge request. Maybe you will have to close the current one and open a new one from the new branch.
We’ll try to walk you through the process

I have now checked out a new branch called issue-1681!

However, even after running the pipeline on this new branch, it is not added to the list of analyzed branches on my project on Orekit’s SonarQube. The master branch is listed there, though.

I guess the “Project’s Main Branch is not analyzed yet.” message is due to my fork not having a “main” branch, but I have seen other projects manage to get other branches added to the list of analyzed branches without having a main branch, such as this one.

I have tried to run the pipeline on other branches of my fork, including branches not created by me (e.g., release 12.0), but that also didn’t work.

I have noticed, however, that for all the Orekit forks currently analyzed in Orekit’s SonarQube, all of the branches are in status “Passed”. I guess this means they passed the pipeline run?
I am wondering if my issue-1681 branch is not being added to the list of analyzed branches because the verify part of the pipeline fails?

On the other hand, it does indeed seem that it is not possible to change the source branch of the merge request, so I was thinking of opening a new one after managing to get the new branch analyzed on SonarQube.

Thanks a lot again for your help in setting this up!

Hi @Rafa,

Thanks!

Yes, if you have errors in the tests of the “verify” job, then the Sonar analysis is not launched…
You have to fix those tests if you want the branch to appear in Sonar.

Yes, sorry about that.

You’re very welcome @Rafa !

Thanks a lot for the clarifications!

Regarding the errors in the “verify” job for the pipeline on my issue-1681 branch, it seems that some (all?) of the remaining errors come from test suites unrelated to my newly added code, e.g., OrekitExceptionTest.testBadMessage.

I guess I should not be worried about this? But in that case, it seems like I can’t deliver on the goal of getting a delta of the new code on SonarQube.

Sorry if I’m missing some obvious solution here!

Hi @Rafa,

I’ve checked out your branch and run the tests locally.
Actually, the problem is that you added new exception messages, but didn’t update the translation files accordingly.
Indeed, when adding a new exception message, you have to add it to all localization files.
You don’t have to translate the message by yourself. But still, it has to appear.
The master file is the English one. You have to give the key of the message and repeat the English text here.
For the other languages, simply copy/paste the English messages you just added and replace the translated text with <MISSING TRANSLATION>.
If you have time, you can translate the messages in your mother tongue :wink:
See for example:

Last thing, you will have to increase the expected number of messages in the test OrekitMessagesTest.testMessageNumber().
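As a concrete illustration of the steps above, this is roughly what the entries could look like. Note that the key `NEW_MESSAGE_EXAMPLE`, the parameter, and the exact file names are hypothetical placeholders for illustration, not actual Orekit message keys:

```properties
# in the English master file (e.g. OrekitMessages_en.utf8), add the key
# and the English text:
NEW_MESSAGE_EXAMPLE = hypothetical new message with parameter {0}

# in every other localization file (e.g. OrekitMessages_fr.utf8), add the
# same key with the untranslated marker (or a real translation if you can):
NEW_MESSAGE_EXAMPLE = <MISSING TRANSLATION>
```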

Sorry, it’s a bit of a hassle to do, but it’s necessary for now when adding new exceptions in Orekit.

Cheers,
Maxime

Thank you so much for the detailed explanation! That was really helpful, and I finally got the pipeline to run properly! I included the translations for the new exception messages.

I’ve now opened a new merge request with the correct branch as the source, closed the older one, and my issue-1681 branch is in the list of analyzed branches for my fork in SonarQube.

I also took the chance to open another merge request adding the missing Spanish translations, which hopefully helps tackle this goal :slight_smile:

Best wishes,

Rafa

Hi @Rafa,
I am having a look at the code you posted, but since I am not an expert programmer I think it’s best to ask you directly. I have opened issue #1571 regarding the implementation of the .bpc file format from JPL to compute the Moon pole with precision. Is what you have implemented so far capable of reading such a file and extracting the polynomial coefficients that need to be used for the Moon pole definition?

Thanks in advance and great work!
Davide

I am precisely working on reproducing the trajectory of GRAIL-A with high fidelity! So this is something of great interest.

The file that you have linked is of type PCK. This is a file type that I have in mind adding support for in the short term, and it should not be too hard, but it is not something that can be handled yet by the code in the merge request I made.

PCK kernels are defined within the framework of DAF files, which I have already written code to support, since DAF is also required for SPK files. So it will just be a matter of correctly interpreting the values read from the DAF file. Luckily, there are only 3 PCK segment types (versus nearly 20 for SPK), which makes things easier.
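Since DAF support underpins all of this, here is a minimal, hedged Java sketch of decoding the start of a DAF file record, assuming the layout from NAIF's DAF documentation (fixed 1024-byte records; the file record begins with an 8-character ID word followed by ND and NI as 32-bit integers in the file's byte order). The `main` method feeds it a synthetic little-endian header fabricated for illustration; this is not Orekit's actual parser.

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.charset.StandardCharsets;

/** Hedged sketch: decode the first fields of a DAF file record. */
public class DafHeaderSketch {

    /** Simple holder for the fields we extract. */
    record Header(String idWord, int nd, int ni) { }

    static Header parse(byte[] fileRecord, ByteOrder order) {
        ByteBuffer buf = ByteBuffer.wrap(fileRecord).order(order);
        byte[] id = new byte[8];
        buf.get(id);           // LOCIDW, e.g. "DAF/PCK "
        int nd = buf.getInt(); // ND: double components per segment summary
        int ni = buf.getInt(); // NI: integer components per segment summary
        return new Header(new String(id, StandardCharsets.US_ASCII).trim(), nd, ni);
    }

    public static void main(String[] args) {
        // synthetic little-endian file record, fabricated for illustration
        ByteBuffer buf = ByteBuffer.allocate(1024).order(ByteOrder.LITTLE_ENDIAN);
        buf.put("DAF/PCK ".getBytes(StandardCharsets.US_ASCII));
        buf.putInt(2); // example ND value
        buf.putInt(5); // example NI value
        Header h = parse(buf.array(), ByteOrder.LITTLE_ENDIAN);
        System.out.println(h.idWord() + " ND=" + h.nd() + " NI=" + h.ni());
    }
}
```

A real reader would also inspect the LOCFMT field of the file record (“LTL-IEEE” vs “BIG-IEEE”) to pick the byte order before decoding anything else.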

I plan to work on these and other DAF/SPK-related additions in the coming weeks!

Thanks, very clear. I will try to do some tests myself with the DAF parameters and keep you posted if I get something.
In addition, my benchmark will be done against a parser that I have written which takes the quaternion directly from the CSPICE output (not elegant, but for a low rotation rate such as the Moon pole’s it is quite accurate).

Keep it up!
Davide

To be specific, I guess you could hack your way into it by doing something like:

  • Read the file as a DAF
  • But this will be a rather meaningless set of numbers, except for a very generic header
  • You will need to locate the coefficients in this mess. I think you will likely be handling a Type 3 PCK, which provides Chebyshev coefficients for an angle and its derivative. I think this is likely because text-based DE files provide lunar mantle libration angles and angular velocities, so I guess these will be equivalent to having the angles and derivatives.
  • You could then pass them onto a Chebyshev interpolator
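To make the interpolation step concrete, here is a hedged Java sketch of how Chebyshev records of this kind are typically evaluated: the record covers an interval described by a midpoint and radius, the epoch is normalized to [-1, 1], and the series is summed with Clenshaw’s recurrence. The coefficients below are made up for illustration and do not come from any real kernel.

```java
/** Hedged sketch: evaluate a Chebyshev series as used by SPK/PCK records. */
public class ChebyshevSketch {

    /** Clenshaw evaluation of sum_k c[k] * T_k(x) for x in [-1, 1]. */
    static double clenshaw(double[] c, double x) {
        double b1 = 0.0;
        double b2 = 0.0;
        for (int k = c.length - 1; k >= 1; k--) {
            double tmp = 2.0 * x * b1 - b2 + c[k];
            b2 = b1;
            b1 = tmp;
        }
        return x * b1 - b2 + c[0];
    }

    /** Map an epoch t into the record's interval (mid, radius) and evaluate. */
    static double evaluate(double[] c, double t, double mid, double radius) {
        double x = (t - mid) / radius; // normalized time in [-1, 1]
        return clenshaw(c, x);
    }

    public static void main(String[] args) {
        // made-up coefficients: angle(x) = 1 + 0.5*T1(x) + 0.25*T2(x)
        double[] c = {1.0, 0.5, 0.25};
        // at t = mid, x = 0: T1(0) = 0, T2(0) = -1, so value = 1 - 0.25
        System.out.println(evaluate(c, 100.0, 100.0, 50.0)); // prints 0.75
    }
}
```

The derivative coefficients of a record with rates would go through the same routine, just as a second series over the same interval.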

I can’t promise anything, but I might have a working prototype next week of a proper implementation mapping DAF contents into a PCK kernel.

@DDega I have a first draft for handling PCK kernels on my issue-1571 branch. It is very early: I still need to add a test suite and refactor quite a few things in the DAF and SPK modules, but from my quick tests it seems to parse and evaluate the .bpc file you shared correctly, in case you want to give it a shot!

Thanks, I will have a look at it.