How to automate the feed sync between a local installation and emoncms


I’m using EmonCms 11.3.22 and Sync 2.1.4, and I would like to automatically sync a feed between the local instance and the public emoncms instance. Right now I’m using the Emoncms Sync functionality, but I have to regularly press the upload button to sync the remote feed with the local one. The local feed is created by a new Python service that fetches data from MELCloud and posts it locally via the API.
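For context, this is roughly the kind of call such a service makes against the local instance: a minimal sketch using the emoncms input API, shown here with curl. The node name, payload keys, and API key are placeholders for your own values.

```shell
#!/bin/sh
# Minimal sketch of posting a reading to a local emoncms instance through
# the input API -- the same request a Python service can make with any
# HTTP library. EMONCMS_URL, NODE, APIKEY, and the payload keys are
# placeholders, not values from this thread.
EMONCMS_URL="http://localhost/emoncms"
NODE="melcloud"
APIKEY="YOUR_READ_WRITE_APIKEY"

# fulljson carries the readings as a JSON object of key:value pairs.
# (A real client should URL-encode the payload before sending it.)
PAYLOAD='{"flow_temp":35.5,"power":420}'
URL="$EMONCMS_URL/input/post?node=$NODE&fulljson=$PAYLOAD&apikey=$APIKEY"

# Uncomment to actually send the reading:
# curl -s "$URL"
echo "$URL"
```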

I tried to find some documentation on how to sync … but I was not able to find any good article on how to set up such automation.
Happy to have a look into the code if required.

Thanks a lot for the help.



Hi @AlessandroVanzulli, I have the same question. I was surprised that this does not appear to be documented anywhere.

@TrystanLea, is there a feature in emoncms to automatically sync feeds between instances, or do we need to write a script or schedule it ourselves somehow?


I just stuck this in cron on my local instance so it does a daily upload of the feeds I’ve preconfigured in Sync:

#01:01 am - sync local feeds to remote
1 1 * * * php /opt/emoncms/modules/sync/sync_upload.php 2>&1

As @Zarch mentioned. Yes this needs documenting!


Thank you @Zarch. I have now added a cron job to sync every 5 minutes - simple when you know how!

#Every five minutes sync local feeds to remote
*/5 * * * * php /opt/emoncms/modules/sync/sync_upload.php 2>&1
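If you want a record of each run for troubleshooting, the same crontab entry can append its output to a log file instead of discarding it (the log path here is just an example):

```
#Every five minutes sync local feeds to remote, logging output
*/5 * * * * php /opt/emoncms/modules/sync/sync_upload.php >> /var/log/emoncms_sync.log 2>&1
```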

@TrystanLea - Perhaps a systemd service could be added to the codebase to sync the data, and that service could be enabled from the UI?
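For reference, a systemd timer equivalent of the cron job above might look something like this. The unit names are hypothetical and the php path would need adapting to your install:

```
# /etc/systemd/system/emoncms-sync.service  (hypothetical unit name)
[Unit]
Description=Upload preconfigured sync feeds to remote emoncms

[Service]
Type=oneshot
ExecStart=/usr/bin/php /opt/emoncms/modules/sync/sync_upload.php

# /etc/systemd/system/emoncms-sync.timer
[Unit]
Description=Run the emoncms sync upload every five minutes

[Timer]
OnCalendar=*:0/5
Persistent=true

[Install]
WantedBy=timers.target
```

Enabled with `systemctl enable --now emoncms-sync.timer`, a UI toggle would then only need to start or stop the timer.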

Yes I think something like that would be nice.

The sync module was not initially designed for regular data uploads. It was designed more as a tool for moving data from a local system to a remote system (or vice versa), and for backups.

But it would be nice to make it more of a continuous replication option. The upload of the individual feed data is quite efficient, as it’s done in binary, but the preparation for the upload, and the fact that a separate request is made for each feed, is less efficient. Ideally, some way of combining all the requests into a single packaged upload would be good…


If you are adding flexibility to this, perhaps allow defining different targets, each with its own set of feeds?

I’d not thought about using it to duplicate data between systems. I have a partially finished migration that has stalled because syncing the data via rsync takes so long :frowning:

Yes that was part of the reason for the sync module.

One key downside of the current implementation is that it only uploads or downloads data that is new, i.e. after the last end time of whichever feed (local or remote) is behind. It does not transfer edited datapoints if the edits were made before that end time.

Ah OK. If it is efficient, then a “just copy everything” option would be great.

If you have done a backup/restore (by any means) and the inputs on the new system are running, then running a sync from the old system should fill in all the gaps from the migration.