I have a set of data that I have been attempting to fit. An intermediate step in obtaining model predictions is solving a differential equation over time with a numerical method. I have tried fitting the data with a least-squares method in MATLAB. It finds a solution in a reasonable amount of time; however, the optimized parameters almost always end up close to the initial guesses. So it appears there are many local minima, and I have tried a wide range of initial guesses in search of a potential global minimum.
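For concreteness, the workflow follows essentially this pattern; here is a minimal Python sketch of it using SciPy (the exponential-decay equation, the parameter values, and the multi-start guesses are stand-ins for illustration, not my actual model or data):

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

def model(params, t_eval):
    # Toy stand-in for my actual differential equation: dy/dt = -k*y.
    # Tight tolerances keep the residuals smooth enough for the
    # optimizer's finite-difference Jacobian.
    k, y0 = params
    sol = solve_ivp(lambda t, y: -k * y, (t_eval[0], t_eval[-1]),
                    [y0], t_eval=t_eval, rtol=1e-10, atol=1e-12)
    return sol.y[0]

def residuals(params, t_eval, data):
    return model(params, t_eval) - data

t = np.linspace(0.0, 5.0, 50)
true_params = np.array([1.3, 2.0])
data = model(true_params, t)  # synthetic noiseless "data", for illustration

# Multi-start: try several initial guesses, keep the lowest-cost fit,
# in the hope of escaping local minima
best = min((least_squares(residuals, x0=guess, args=(t, data))
            for guess in [(0.1, 0.5), (1.0, 1.0), (5.0, 4.0)]),
           key=lambda fit: fit.cost)
print(best.x)
```

On this toy problem the multi-start recovers the true parameters; on my real problem the fits stay near their starting points, which is what makes me suspect many local minima.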

I have also tried doing Bayesian inference with Theano in a Jupyter notebook. However, since I need to solve the differential equation numerically (i.e., marching from one time step to the next), the for loop that steps through time runs very slowly when it operates on tensor variables. For loops and tensor variables appear to be a bad combination, since calling the same model function with floats as inputs runs quite quickly.

I’m somewhat stuck now. It doesn’t seem that I can vectorize the differential equation, since the solution at each time point depends on the solution at the previous time point. So Theano seems too slow, although I would still like to use that method if possible. Are there any good functions or libraries you would recommend for fitting data to a model equation like this?
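To show what I mean by "can't vectorize," here is the shape of the time march in plain NumPy (again with dy/dt = -k*y standing in for my actual equation):

```python
import numpy as np

def time_march(k, y0, t):
    # Forward-Euler march: each step needs the previous step's result,
    # so the loop is inherently sequential -- there is no single array
    # expression over all of t that avoids iterating.
    y = np.empty_like(t)
    y[0] = y0
    for i in range(1, len(t)):
        y[i] = y[i - 1] + (t[i] - t[i - 1]) * (-k * y[i - 1])
    return y

t = np.linspace(0.0, 5.0, 1000)
y = time_march(1.3, 2.0, t)
```

With plain floats this loop is fast; it is only when the same loop operates on symbolic tensor variables that it becomes very slow.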

Source: Python Questions