Community
OpenEnergyMonitor

Emoncms automatic data backup from EmonPi to Remote Computer

emoncms
(Neil Hastings) #1

I wish to automate a daily emoncms backup to a remote computer. I’m able to do this using Node-RED.

Node-RED flow:

[{"id":"c170246c.445538","type":"exec","z":"a5ce04b2.5a31f8","command":"sshpass -p \"password\" scp ~/data/emoncms-backup-* [email protected]:~/Documents/RPi/EmonPi/Backup/","addpay":false,"append":"","useSpawn":"false","timer":"","oldrc":false,"name":"SCP >> Remote","x":460,"y":200,"wires":[[],[],["8ad4e228.87d7e"]]}]

This flow runs ./emoncms-export.sh, then scp’s the file to the remote computer, and finally removes the original file from the local computer. It’s error-checked at each stage. This works fine, but I want to make the backup and store it remotely without ever saving the file to the local SD card, i.e. zip and scp directly to the remote, with no file saved locally.

So I thought I could modify…

gzip -fv $backup_location/emoncms-backup-$date.tar 2>&1 in emoncms-export.sh

by piping the scp command, something like this…

gzip -fv $backup_location/emoncms-backup-$date.tar 2>&1 | sshpass -p "password" scp ~/data/emoncms-backup-* [email protected]:~/Documents/RPi/EmonPi/Backup/

(I know I can create ssh keys to eliminate the password requirement)

I think I’m close, but not there. I know this piped command does not work, as it creates two remote backup files: filename.tar (7 MB) and filename.tar.gz (1.7 MB).

If I can get this to work I’ll then add it as a cron job.

Can anyone help me define the correct syntax for piping gzip into scp, if that is even possible?

Thanks in advance

Neil

0 Likes

(Bill Thomson) #2

Hi Neil,

Here’s a link with some info on one way to do it:

If I’m reading it correctly, the main point is to fully qualify the archive’s destination name.

0 Likes

(Bill Thomson) #3

Here’s another:

0 Likes

(Neil Hastings) #4

Hello Bill,

Thanks. My aim is (was) to eliminate saving the zipped file on the SD card and zip directly to the remote to reduce SD read/write.

Is this worth doing? Or is writing, copying and removing one backup file per day not a problem for the SD card longevity?

0 Likes

(Bill Thomson) #5

How much data are you archiving?

0 Likes

(Neil Hastings) #6

Right now it’s 1.7 MB, but I expect that to grow. I only have a few weeks of data so far.

0 Likes

(Bill Thomson) #7

A pipe takes the output of one command and feeds it as input to another command. If your archive destination were on the same network as your archive source, it would just be a matter of changing directory to the destination and running the commands on the source.

But as the second link I posted mentions, scp doesn’t know how to deal with STDIN/STDOUT, but ssh does. Have you had a look at the second link?
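
To make that concrete, here is a minimal sketch of the streaming approach. The remote host, user, and destination path are assumptions carried over from the Node-RED flow posted above; adjust them to your own setup.

```shell
#!/bin/sh
# Sketch: stream a gzipped tar straight to the remote host over ssh,
# so nothing is ever written to the local SD card.
stream_backup() {
    src="$1"; remote="$2"; dest="$3"
    # tar writes the archive to stdout (-f -), gzip compresses the
    # stream, and "cat > file" on the remote end writes the bytes
    # into the destination file.
    tar -cf - -C "$src" . | gzip | ssh "$remote" "cat > '$dest'"
}

# Example call (commented out -- it needs the remote host to be reachable):
# stream_backup "$HOME/data" pi@192.168.1.65 \
#     "Documents/RPi/EmonPi/Backup/emoncms-backup-$(date +%Y-%m-%d).tar.gz"
```

With ssh keys set up, the same line drops straight into a script or cron job with no password prompt.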

0 Likes

(Bill Thomson) #8

And yet another:

0 Likes

(Bill Thomson) #9

The article at the last link I posted sounds exactly like what you’re after.

0 Likes

(Neil Hastings) #10

Bill,

I looked at the second one. I’ve also seen many on the web. My problem is that I’m not that experienced with Linux commands. I’ve tried many of the solutions offered, all to no avail.

I’ll try this again later and will reply here if I make progress.

Thanks

0 Likes

(Bill Thomson) #11

OK. I posted a third link while you were typing…

0 Likes

(Neil Hastings) #12

Thanks. I’ll explore the ssh option rather than scp.

Have a great day.

0 Likes

(Dave Howorth) #13

Rather than bothering to compress across the network, another technique would be to simply compress it first on the Pi and then copy the compressed file across. Since you don’t want to write it to the SD card, just write the compressed file to one of the many tmpfs (i.e. RAM-backed) filesystems on an emon system. /tmp stands out as an obvious choice.
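
A minimal sketch of that tmpfs route, with the source directory and remote details from the earlier posts taken as assumptions:

```shell
#!/bin/sh
# Sketch: build the compressed archive in /tmp (RAM-backed on the
# emonPi image), so the archive never touches the SD card.
backup_via_tmpfs() {
    src="$1"                                   # directory to archive
    archive="/tmp/emoncms-backup-$(date +%Y-%m-%d).tar.gz"
    tar -czf "$archive" -C "$src" . || return 1
    printf '%s\n' "$archive"                   # hand the path back to the caller
}

# Usage (scp step commented out -- it needs the real remote host):
# f=$(backup_via_tmpfs "$HOME/data") &&
#     scp "$f" pi@192.168.1.65:~/Documents/RPi/EmonPi/Backup/ &&
#     rm -f "$f"
```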

Also, I’m not sure what Node-RED is helping with. Isn’t it simpler to just write a script and then put it in a cron job? Or even a systemd timer unit if you’re feeling ultra-modern (and brave).
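
For the cron route, a single crontab line is all it takes. The script path below is a placeholder, not anything installed on an emonPi:

```shell
# Hypothetical crontab entry (add via `crontab -e`): run the backup
# script daily at 02:30, discarding output.
30 2 * * * /home/pi/emoncms-backup-remote.sh >/dev/null 2>&1
```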

1 Like

(Eric Wouters) #14

Are you sure that is viable? I have feeds running for over two years now (just emoncms on a real server); one of them is 787 MB, the second is 383 MB. I’m not sure that, even compressed, they will fit in a RAM-backed tmp filesystem, or how long the Pi would take to work on files of that size.

1 Like

(Neil Hastings) #15

Hello Dave,

This I understand and can make work. Thanks

You’re right; making Node-RED do it made me realize I could make the emoncms-export.sh script do the same thing. I’ll read up on systemd timers. I’m feeling brave… :smile:

Regards

0 Likes

(Brian Orpin) #16

If you have certbot installed, it does just this for the certificate renewal - a good example to follow.

0 Likes

(Brian Orpin) #17

There is an old Dropbox backup thread here, Emoncms backup to Dropbox, if anyone is interested.

The only problem is that Dropbox now enforces a three-machine limit on free accounts for syncing.

Has anyone looked at using Google Drive from the command line? I know it is possible, as my WordPress site and my HA installation back up to my Google Drive using an API key.

0 Likes

(Bill Thomson) #18

Ya musta’ missed that one, Dave… :wink:
In his first post he said:

“I wish to make the backup and store remotely without saving the file to the local computer SD card.”

I thought about suggesting that too. But, as Eric alluded to, the OP said:

“Right now it’s 1.7 MB, but I expect that to grow.”

So while it may fit now, the file may outgrow tmpfs.

0 Likes

(Bill Thomson) #19

Sounds like he’s still a bit new at it, or at least not comfortable with the more advanced aspects of Linux.

0 Likes

(Neil Hastings) #20

I’d say my experience with Linux is ‘patchy’. But I don’t mind pressing buttons and asking why later :smile:

0 Likes