Importing historic data from .csv file onto fresh Emoncms

After 3 years of running daily with no problems, yesterday I tried to update my Emonpi. I wanted to set the IP address after some home network changes and was unable to use SSH on that older version (don’t know the number). Unfortunately update went wrong and I have ended up starting from scratch flashing the SD card with a new version, and I’m still trying to get my Emonpi functioning again.

Assuming that I succeed (it is saying "updating…" again), I have the old data up to yesterday evening in csv files and I would like to be able to view it via the dashboard. Is there a way to import this onto the Emonpi?

Hi Dave, welcome.

That is a shame - we probably could have helped :smile:. Always worth using a new card when doing this so you can recover data.

How did you create this backup as the normal backup is not csv format?

Thanks for such a quick reply. What started as a simple job (setting IP address) has turned into a bit of a monster. I didn’t know I was in trouble until it was too late, so often the case.

The version I had (from April 2016, possibly SD-17Jun2015_2) didn’t seem to have a backup facility. I pulled the .csv files off the SD card after I had extracted it. The data looks complete, but I don’t know how to get it back onto the emonpi.

I have also yet to get into the emonpi having flashed the recent software (adjusted for RPi2) onto it. It sat saying “updating” for ages.

Everything worked so nicely for 3 years that I have forgotten how to set up, create dashboard views etc.

I suspect these are the data files, but not strictly csv although they do have commas in them :smile:. What are the filenames? This is probably the raw data, unfortunately the dashboards etc are held in the SQL database so will be lost.

By fix RPi2 I presume you mean the overclocking? Did you do the SSH ‘fix’ before first boot? It does take a while - are you connected by Ethernet to it?

If it isn’t wanting to boot, the only real troubleshooting suggestion I have is to take the RPi out of the enclosure and connect a monitor to the HDMI port so you can see what is happening.

I’m a db guy professionally so able to convert raw data to suitable SQL statements if I can get at the db. It is playing with basic processors that does my head in. :smile:

The filenames are “1 Power CU.csv”, “3 Temp Tank Top.csv”, “4 Temp UFH feed.csv”, “7 Power PV.csv” etc. matching the sensors/data feeds that I have. The file contents are comma-separated data, > 300,000 lines per file, of what I assume is a time value as an integer and a power or temp reading. So yes, historic raw data.

Emonpi has finally finished “updating”, now showing “Ethernet: Yes” and its IP address. I can see the inputs listed via ethernet on the Web browser so I am moving forwards. I guess I have to try and remember how to config things again.

Jumping ahead of that (did it before so must be able to do it again, right?), what kind of db does it have and how can I access it? Via SSH?


It uses the standard MariaDB for some things and usually a time-series flat-file store otherwise. I’m surprised that the data is as you describe - it may have been a SQL feed, so sorry, I cannot help further :frowning_face:. Others may be able to!

Yes by SSH. You can enable it via the push button on the EmonPi.

I have found the export facility that creates .csv files of the feeds, but looking through the useful scripts I don’t see anything to help import any data exported in that way.

I’m hoping someone can point me at something that does it. I have my Emonpi running again, and can SSH in as I wanted, but I so miss seeing the 3 years of historic data I wiped off.

Can you point me to that, please?

@borpin creation of the csv is documented at Exporting CSV - Guide | OpenEnergyMonitor
What I did is at the bottom of that page: I accessed the export tool from the feed list interface.

Every day a learning day.

I’m not sure if there is a reverse process; @glyn.hudson or @TrystanLea will be your best bet.

For future reference, you will find on the new image a Backup Module - that is a much better way to backup data & setup information.

I don’t think there’s any particularly easy way to do it. I hope somebody who knows better can come along and tell me I’m wrong.

AIUI, the system originally kept data in a mysql database, but then discovered that doing so loaded a pi excessively and wore out the SD card in a very short time. So they moved to a system of open format flat files, with an in-memory buffer. I asked a similar question a while ago and got a discouraging reply. Importing data to an emoncms system

I’ve thought about writing a program to input historical data, and then about writing a new file format that would make such things easier. I imported the system onto the machine I use for development, but have never got it to run because the system is too closely linked to Debian-based systems at present, and I lost motivation.

I just backup the data files on the system every day to an external backup disk using scp in a cron job. I hope that will be enough to restore my system if it ever dies. Doesn’t help in your current situation though, sorry.

Thanks for that @djh. I was surprised at the use of MySQL for such a job (which is what the guide says), and it makes more sense of what I can see via SSH. It actually isn’t using MySQL for the data at all, just for the CMS config stuff, e.g. dashboards, feeds, graphs.

But a bummer, because the flat files are harder to manipulate and fill with older data than SQL tables would be. I wonder if I can put something together that will put my .csv data into the file format (whatever it is)? OSS, so somewhere in Git are all the answers, but it would be easier if someone can just tell me.

That shouldn’t be too difficult as the formats are pretty basic.

The 2 main feed types are phpfina (PHP fixed interval no averaging) and phptimeseries.

phpfina is basically a pair of files: a .dat which is a stream of 32-bit floats end to end, and a .meta that records the interval and start time. Each 4 bytes is an “interval” from the “start time”, e.g. for a 10s feed the 40th-43rd bytes are for start time + 100s (place in file: 40/4 bytes = 10 intervals @ 10s). There is no time data stored for each datapoint, as its position relative to the beginning of the file (start time) gives you the timestamp. Any missing datapoints are null. (see Learn | OpenEnergyMonitor)

phptimeseries is a stream of 4-byte integer timestamps (unix) and 4-byte floats plus one unused byte (9 bytes in total). (see Learn | OpenEnergyMonitor)

Which type you use will depend on your data.

Another way to do this could be to create a simple script that uses your csv to repost the data in chronological order, starting with the oldest/first data. emoncms can accept data that is up to 4 years old and has a bulk upload API, so you could upload via the emoncms inputs rather than playing with raw files. Neither is ideal, I know - just giving you another option.
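A script along these lines could do it (standard-library Python only). The host, node id and API key are placeholders, and the bulk payload shape - a time reference plus [[offset, node, value], …] items - is how I understand the emoncms input/bulk.json API, so double-check against the API docs before running:

```python
import csv
import json
import time
import urllib.parse
import urllib.request

EMONCMS = "http://emonpi.local/emoncms"   # placeholder host
APIKEY = "YOUR_READWRITE_APIKEY"          # placeholder read/write key
NODE = 1                                  # placeholder node id
BATCH = 500                               # datapoints per bulk request

def make_batches(rows, node=NODE, batch=BATCH):
    """Group (unix_time, value) rows into bulk-API payloads, oldest first.
    Each payload item is [offset_from_reference_time, node, value]."""
    rows = sorted(rows)
    batches = []
    for i in range(0, len(rows), batch):
        chunk = rows[i:i + batch]
        ref = chunk[0][0]                 # reference time for this batch
        batches.append((ref, [[t - ref, node, v] for t, v in chunk]))
    return batches

def repost(csv_path):
    with open(csv_path) as f:
        rows = [(int(float(t)), float(v)) for t, v in csv.reader(f)]
    for ref, data in make_batches(rows):
        params = urllib.parse.urlencode(
            {"time": ref, "data": json.dumps(data), "apikey": APIKEY})
        urllib.request.urlopen(f"{EMONCMS}/input/bulk.json?{params}")
        time.sleep(0.1)                   # don't flood the Pi
```

Sorting before batching matters here, since the feeds refuse datapoints older than what they already hold.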

Did you flash the new image to your original SD card? If you have by any chance used another SD card, you could easily mount your old SD card and get at the raw data files and MySQL tables even though the update failed. The data will not have been affected by the failed update; just the running of that OS and emoncms instance will have been screwed up, so retrieving the data may still be possible unless you have overwritten it with the new image.

Thanks @pb66 that is all the info I need. I’ll report back if I am successful, in case it is of use to others.

Unfortunately I flashed the original SD; it was the only suitable card I had around. Since I couldn’t SSH (the reason I tried to update) and, looking at the SD in Win10, all I could see was the boot sector, I took the only option I could see to get data off (as .csv) and reused the card. Not sure what I could have mounted it on to look at it?

Once you had a new image running on a new SD card you could have mounted the old SD card directly on the emonpi via a USB card reader and dragged the files over. Hopefully you won’t find yourself in the same position again, but if you did, I would recommend just putting the failed SD card to one side and getting another image up and running; from there you can recover your data or maybe even repair the original image.

It is very odd that you could not SSH. From what you said it seems emoncms was still working so you could get the csv extract.

I hope I won’t, but I need to buy both a spare SD card and a USB card reader (mine is built into my NUC). I’m just not set up to deal with SD cards and RPis.

The old version I had installed did not have enabling it as an option when cycling the menu as the guide described, so I assumed SSH was not there. If it was actually enabled by default all along (with some unknown account and password) then that is frustrating, but what’s done is done.

One thing I have noticed on the latest version is that the LCD on the emonpi is permanently lit up, on my older version it used to dim after a timeout. By design or a bug?

Ah, assumptions - they make an ass (of) u (and) me, or are the mother of all foul-ups!!!

It was enabled with a default password :cry:.

I suggest you have a search here and if you don’t find the answer start a new thread. There have been a few bits on the LCD recently and there is a config file you can edit (via SSH).

I thought emoncms couldn’t accept data earlier than the original creation time of the files? If that’s not true, where can I read how to do this, please?

That is correct, hence my advice to “repost the data in chronological order starting with the oldest/first data”. When you “create a feed” in the input processlist it is done in 2 parts. First you define the feed to be created; that’s the bit everyone is familiar with. The feed is initialised as a separate operation when the feed is first written to. This is usually almost instantaneous, as most input processing is done on live inputs, so within seconds of defining the feed it gets initialised when the input is next updated.

The start time of the feed is not set until that first datapoint is written so if your first data update is for 4 years ago, that will be the feeds start time.

The trick is to create the inputs via a single API call (or by a short “dry-run” of your historic data upload script) so that you have inputs to add processes to and create feeds for, but without any data flowing until you have created your feeds and are ready to commence sending data, starting with the earliest data first.
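Something like this could serve as that single call. I’m assuming the standard input/post endpoint with its time and fulljson parameters; the host, key, node and input names are all placeholders to adjust for your setup:

```python
import json
import urllib.parse

EMONCMS = "http://emonpi.local/emoncms"   # placeholder host
APIKEY = "YOUR_READWRITE_APIKEY"          # placeholder read/write key

# Sketch only: build one input/post URL that creates the inputs, using the
# earliest historic timestamp so the feeds initialise with the right start.
def build_post_url(node, earliest_ts, first_values):
    """first_values: dict of input name -> first reading, e.g. {"power1": 0}"""
    params = urllib.parse.urlencode({
        "node": node,
        "time": earliest_ts,
        "fulljson": json.dumps(first_values),
        "apikey": APIKEY,
    })
    return f"{EMONCMS}/input/post?{params}"

# Then, once feeds are set up in the UI, something like:
#   urllib.request.urlopen(build_post_url(1, 1492000000, {"power1": 0}))
```

After this one call the inputs appear in the inputs page, you attach your processes and feeds at leisure, and only then start the bulk upload from the oldest row.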