As I have mentioned in a few posts, I am looking to upgrade the hardware that my kit runs on. In my research I have found that I would be best off using phpfina, as it will be much easier to schedule a backup; the mysql one is taking about an hour with the compression.
It is a hot backup that I am doing, so I have no data loss; with mysqldump I would have a 15-minute window of data loss, which, while not being the end of the world, means I would have a gap in my data.
I believe that phpfina will get around this.
Is there an easy way to convert all my existing mysql feeds over? I have read Moving from mySQL to PHPFINA, but it does not seem to give me the results I'm looking for.
I have had a play with the scripts in usefulscripts/convertdata/archive; the sql to phptimestore one looked promising to start with, but it could not find any suitable feeds.
Any ideas, or do I just start a new feed in phpfina and “archive” the old one away?
After a while of trying, I think I have found a way, but it is very resource intensive.
In a nutshell: I grab the feed details from the “feeds” table and make a note only of the ones which are mysql timeseries. These get stored in an array.
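For anyone following along, that step looks something like this (a rough sketch only: the column names and the engine value of 0 for mysql timeseries are assumptions based on a standard emoncms schema, so check the Engine definitions in your own install):

```python
# Sketch: collect the ids/names of all mysql timeseries feeds from the
# emoncms "feeds" table. ENGINE_MYSQL = 0 is an assumption taken from a
# standard emoncms install - verify against your version's Engine list.
import pymysql

ENGINE_MYSQL = 0

conn = pymysql.connect(host="localhost", user="emoncms",
                       password="password", database="emoncms")

mysql_feeds = []
with conn.cursor() as cur:
    cur.execute("SELECT id, name FROM feeds WHERE engine = %s",
                (ENGINE_MYSQL,))
    for feed_id, name in cur.fetchall():
        mysql_feeds.append({"id": feed_id, "name": name})
```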
I loop through this array, writing the contents of each feed's table to a csv file as:
time, data
I write one of these files per feed.
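The export loop is roughly this (again a sketch, assuming each mysql feed lives in its own feed_<id> table with time and data columns, as in the standard emoncms mysql engine):

```python
# Sketch: dump each mysql timeseries feed to its own csv file.
# Assumes the table naming convention feed_<id> with (time, data)
# columns - check your own database before running anything like this.
import csv

for feed in mysql_feeds:
    table = "feed_%d" % feed["id"]
    with conn.cursor() as cur, open("%s.csv" % table, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["time", "data"])  # header row
        # table names cannot be bound as parameters; the id is an int
        # from our own query, so simple formatting is safe here
        cur.execute("SELECT time, data FROM `%s` ORDER BY time ASC" % table)
        for row in cur:
            writer.writerow(row)
```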
Then, in Python, I work my way through each csv file and, using the bulk importer and a suitable node id, import the data back into the system, 8000 data points at a time.
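The import side looks roughly like this (a sketch against the emoncms input/bulk.json API; the URL, apikey, node id and the time=0 absolute-timestamp convention are all placeholders/assumptions to check against the Input API docs for your version):

```python
# Sketch: replay a feed csv into emoncms via the bulk input API,
# 8000 points per request. Each item is [time, nodeid, value]; passing
# time=0 is intended to make the first element an absolute unix
# timestamp - confirm this against your emoncms version's Input API.
import csv
import json
import requests

EMONCMS = "http://localhost/emoncms"  # placeholder
APIKEY = "YOUR_WRITE_APIKEY"          # placeholder
NODE_ID = 30                          # a node id not already in use
CHUNK = 8000                          # data points per request

def send(batch):
    r = requests.post(EMONCMS + "/input/bulk.json",
                      data={"time": 0,
                            "data": json.dumps(batch),
                            "apikey": APIKEY})
    r.raise_for_status()

def import_csv(path):
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the time,data header
        batch = []
        for t, d in reader:
            batch.append([int(t), NODE_ID, float(d)])
            if len(batch) == CHUNK:
                send(batch)
                batch = []
        if batch:
            send(batch)
```

A Log to feed process on that node id then writes the incoming values into the new phpfina feed.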
For testing I am doing this on a fresh install on a cloud server, with the only process on the input's process list being Log to feed.
Initial testing has converted and imported a single feed of 14 million data points within a window of 90 minutes.
I will zip up the scripts I have used once I have finished and got it all working as expected, as I know others were looking to do this some time back, so they may still want to.