We’re using the Python wrapper to start an Orekit Java VM on our backend. However, since our application can run for long periods of time, we need to refresh the Orekit VM periodically so it always uses the most up-to-date data.
We’ve tried a couple of things to make this happen, none of which properly clear the cache and pull in the new data. Initially we simply ran `setup_orekit_curdir(filename=path)` on an interval, where `path` points to a fileshare that receives new/modified data.
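For context, the interval-driven part looks roughly like the sketch below. This is a minimal illustration, not our production code: the `refresh_fn` callback stands in for whatever reload we run (in our case a call to `setup_orekit_curdir(filename=path)`), and the interval/threading details are simplified.

```python
import threading

def schedule_refresh(refresh_fn, interval_s, stop_event):
    """Invoke refresh_fn every interval_s seconds until stop_event is set.

    refresh_fn: zero-argument callable that reloads the Orekit data
    interval_s: seconds to wait between refreshes
    stop_event: threading.Event used to shut the loop down cleanly
    """
    def _loop():
        # Event.wait returns False on timeout (time to refresh)
        # and True once stop_event is set (time to exit).
        while not stop_event.wait(interval_s):
            refresh_fn()

    t = threading.Thread(target=_loop, daemon=True)
    t.start()
    return t
```

In our service, `refresh_fn` is where the Orekit reload happens, and that reload is the part that does not seem to pick up the new files.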
We then tried deconstructing that helper function to verify that (1) our test data is actually passed to the crawler, and (2) when we add the provider with the crawler, the new data propagates to the VM. Our current attempt looks something like this:
```python
import orekit

if not orekit_vm:
    log.info("Initializing orekit")
    orekit_vm = orekit.initVM()
else:
    log.info("Refreshing orekit")
    orekit_vm.attachCurrentThread()

# Java-side imports (require the VM to have been initialized)
from java.io import File
from org.orekit.data import DataContext, DirectoryCrawler

DM = DataContext.getDefault().getDataProvidersManager()
data = File(path_to_orekit_data_folder)
crawler = DirectoryCrawler(data)

# Snapshot the provider list before removing, so we don't
# mutate the collection while iterating over it
for provider in list(DM.getProviders()):
    log.info(f"Removing: {provider}")
    DM.removeProvider(provider)
DM.clearLoadedDataNames()
DM.addProvider(crawler)
```
Is there something we’re missing to fully clear the cache, or is our only option to fully destroy and restart the Java VM?