Hi team.
I’m running into a memory allocation error when trying to fit an SMA model. Specifically, the error says: `loggerwriter.py write 10 Unable to allocate 43.8 GiB for an array with shape (5885000015,) and data type float64`
I’m unsure how to proceed, or whether this is a memory problem or a system problem. I’ve attached the match configuration, the H5 file, and the CSV file (from a simulation). As always, thanks for your help!
05CV_Simulation.csv (38.4 KB) LGE_05CV_native.h5 (147.8 KB) match_SMA_Simulation.json (926 Bytes)
I found the problem. It has to do with how the smoothing works. The smoothing needs a uniform time spacing between measurements, so it resamples the entire series to the shortest time interval found in the data. With this data set that is a huge problem: the CSV starts with points 1e-6 s apart but runs out to several thousand seconds, so resampling produces roughly 5.9e9 points (5885 s / 1e-6 s), which uses FAR too much memory.
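As a quick sanity check, the 43.8 GiB in the error message falls straight out of that arithmetic (a throwaway snippet, with the numbers taken from the error and the CSV):

```python
# Numbers from the error message: shape (5885000015,), dtype float64.
n_points = 5885 / 1e-6            # duration / smallest interval ~ 5.885e9 samples
gib = n_points * 8 / 2**30        # float64 is 8 bytes per sample
print(round(gib, 1))              # -> 43.8, matching "Unable to allocate 43.8 GiB"
```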
The simplest way to fix this is to resample the data you are matching against to some reasonable sample rate. Typically this is around 1 sample/second.
Normally data like this is generated synthetically, since capturing a million samples per second would be very hard for real equipment to do. What I would recommend is to set `user_solution_times = numpy.linspace(0, 5885, 5886)` and generate the CSV file again, as in the sketch below.
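In case it helps, here is a minimal sketch of how that could look with CADET-Python. The unit name (`unit_001`), the single-component outlet field, and the CSV layout are assumptions based on your attachments, so adjust them to your actual flowsheet (on newer CADET versions the outlet field may be `solution_outlet` rather than the per-component name):

```python
import numpy
from cadet import Cadet  # CADET-Python wrapper around cadet-cli

# Assumption: cadet-cli is installed; point this at the actual binary.
Cadet.cadet_path = "cadet-cli"

sim = Cadet()
sim.filename = "LGE_05CV_native.h5"  # your attached simulation file
sim.load()

# Report the solution once per second (0, 1, 2, ..., 5885 s)
# instead of at every adaptive integrator step.
sim.root.input.solver.user_solution_times = numpy.linspace(0, 5885, 5886)

sim.save()
sim.run()
sim.load()

# Assumption: the outlet you are matching is on unit_001, component 0.
times = sim.root.output.solution.solution_times
outlet = sim.root.output.solution.unit_001.solution_outlet_comp_000

# Regenerate the CSV at the new 1 sample/second rate.
numpy.savetxt("05CV_Simulation.csv",
              numpy.column_stack([times, outlet]),
              delimiter=",")
```

With 5886 points instead of ~5.9e9, the smoothing step should run comfortably in memory.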