Convert MySQL feeds to what?

Short story: my 30GB MySQL database, which has grown over the last 10 years, is too much for my hosting plan, which now has a limit of 3GB per database. Disk space is not an issue yet, so I will be converting my feeds to data files.

Is the current recommended engine for this still phptimeseries, or is there anything else I should consider given future emoncms support?

Looking to use: usefulscripts/sql_to_phptimeseries_mv.php at master · emoncms/usefulscripts · GitHub

Thank you

Hello @nchaveiro

I'm glad you are considering changing to phptimeseries (or possibly phpfina), and that it potentially solves a specific problem with your hosting.

The format of the phptimeseries data files is exactly the same as the mysql feeds (in terms of the binary structure), which is why phptimeseries still uses the .MYD extension. I initially made use of that on emoncms.org to switch gigabytes of data over to phptimeseries without a conversion process: just a command-line file move and then changing the engine field in the feeds table.
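
For reference, as I understand it each phptimeseries datapoint is a 9-byte record: a 1-byte marker (mimicking the MyISAM fixed-row header), a 4-byte unsigned timestamp, and a 4-byte float, which is what makes the files byte-compatible with the .MYD data. A minimal Python sketch of that layout; the 0xF9 marker value and little-endian ordering are my assumptions, so check against your actual files:

```python
import struct

# Assumed phptimeseries record layout: marker byte, uint32 time, float32 value
RECORD = struct.Struct("<BIf")  # 9 bytes per datapoint

def pack_point(time: int, value: float) -> bytes:
    # 0xF9 is the assumed per-row marker byte
    return RECORD.pack(0xF9, time, value)

def unpack_point(record: bytes) -> tuple:
    marker, time, value = RECORD.unpack(record)
    return time, value
```

Reading a feed file back is then just a matter of stepping through it 9 bytes at a time.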

It looks like that archived conversion script goes through the mysql data row by row, so it does not take advantage of that potentially very fast switch-over, but perhaps it is suitable for your hosting configuration if you do not have file-level access to the mysql data?

As that script is archived I’m not sure if it will still work now, so it’s probably worth proceeding carefully.

If your data is recorded at a fixed interval and is more than ~50% complete, you will save disk space with phpfina. PHPFina is also quite a bit faster on the visualisation side (as it doesn't require a binary search), and I've focused more of my development time on that engine for some of the other capabilities in emoncms, such as post-processing and the sync module.
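
The disk-space point can be made concrete. PHPFina stores one 4-byte float per interval slot and derives the slot position from the timestamp, so no timestamp is stored per point, versus roughly 9 bytes per point in phptimeseries. A sketch under those assumed sizes (the exact break-even is 4/9, about 45% completeness):

```python
# Sketch of PHPFina-style fixed-interval addressing (assumed layout:
# metadata holds start_time and interval; data file is a flat float32 array)
def slot_position(time: int, start_time: int, interval: int) -> int:
    """Array index for a timestamp; no per-point timestamp is stored."""
    return (time - start_time) // interval

def bytes_phpfina(n_slots: int) -> int:
    return n_slots * 4          # one float32 per slot, filled or not

def bytes_phptimeseries(n_points: int) -> int:
    return n_points * 9         # assumed 9 bytes per stored point

# Example: 10s data that is 50% complete over one day
slots = 86400 // 10             # 8640 slots
points = slots // 2             # 4320 stored points
# bytes_phpfina(slots) = 34560, bytes_phptimeseries(points) = 38880,
# so phpfina is already smaller at 50% completeness
```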

I do have some significant changes in the pipeline that bring some of the benefits of phpfina to phptimeseries such as on the fly averaging and full support for timezone aligned data requests. I hope to get back to that development soon.

Let me know if you are able to access the mysql data files directly.

Thanks for the answer. I have no access to those mysql files, but it seems it would not have worked anyway: the mysql data contains repeated timestamps with the same or different values, many from network retries by my posting devices over the years, and it seems phptimeseries can only hold unique timestamps.
In the meantime I tried the conversion script, and it's painfully slow. I tracked the bottleneck to the phptimeseries engine, which opens and closes the file for writing on every row it posts.
I'm now trying a direct approach, writing from the convert loop itself instead of going through the engine. It's 20x faster.
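
For anyone attempting the same, the direct approach can be sketched roughly like this: keep one file handle open for the whole run, skip repeated timestamps (phptimeseries needs them unique and ascending), and write the packed rows in batches rather than one at a time. The 9-byte record layout and the shape of the row source below are illustrative assumptions, not the actual script:

```python
import struct

# Assumed 9-byte phptimeseries row: marker byte, uint32 time, float32 value
RECORD = struct.Struct("<BIf")

def convert(rows, out_path, batch_size=10000):
    """rows: iterable of (unix_time, value) pairs, assumed time-ordered,
    e.g. from 'SELECT time, data FROM feed_x ORDER BY time ASC'."""
    written, last_time, batch = 0, -1, []
    with open(out_path, "wb") as fh:          # one handle for the whole run,
        for time, value in rows:              # not open/close per row
            if time <= last_time:             # drop duplicate/out-of-order stamps
                continue
            last_time = time
            batch.append(RECORD.pack(0xF9, int(time), float(value)))
            written += 1
            if len(batch) >= batch_size:
                fh.write(b"".join(batch))
                batch.clear()
        fh.write(b"".join(batch))             # flush the final partial batch
    return written
```

For example, `convert([(10, 1.0), (10, 1.1), (20, 2.0)], "feed_1.MYD")` writes two points, silently dropping the duplicate timestamp.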

Edit:
Here is the convert script I've used, for others who need it:


Great to hear that you solved this with a direct approach; agreed, that's much faster. Thanks for the associated pull request to the usefulscripts repo.

An additional side effect is that the 30GB of MySQL feed data takes half the disk space in phptimeseries format: 15GB.
Hope it goes well for the next 10 years to come :slight_smile:

That's great to hear, and it might take up half the space again with PHPFina if the data is highly regular, e.g. always at 10s resolution without large data gaps. Do you think converting some of the feeds to PHPFina would work for you?

I may try that to see how much space it takes.
The older energy feeds started at a 2-second post interval, later moving to 5 seconds.
I also started logging temperatures at 10 seconds but now stick to 60s, within the same feeds.
Maybe the flexibility of variable timestamps is better for my use case.