
DifferentialEquations Roadmap #47

Closed
finmod opened this issue Sep 11, 2016 · 15 comments

Comments

@finmod

finmod commented Sep 11, 2016

DifferentialEquations is dealing with the forward problem in Julia: predicting the model behaviour for all types of DEs and all types of solvers.

Are you planning to go in the direction of the inverse problem: inferring the model's parameters from data? A hint in this direction can be found here: http://www2.warwick.ac.uk/fac/sci/statistics/staff/academic-research/girolami/podes/ . Coming from Matlab, it should translate well to Julia.

In this area, there is also the work on automatic differentiation with wrappers in python as in here: http://www.bmp.ds.mpg.de/pyodefit.html which could benefit hugely from treatment in Julia.

Thank you.

@ChrisRackauckas
Member

Yeah, I plan on handling both of these. From what I see you included two slightly different things, so let me handle those separately:


Uncertainty quantification

For the problem of uncertainty, I already have the SDE suite and the functions for doing multi-node Monte Carlo that go with it. You can map problems with random parameters to random dynamical systems which can be mapped to stochastic differential equations under some assumptions. Probability distributions can be estimated from those, or by solving the Forward Kolmogorov equations, which are simply semilinear Heat Equations which I plan on extensively covering with an FDM part of the PDE suite soon. Other methods, like an MCMC-based method, could be written as a different dispatch for handling the same kind of thing. So the tools for doing all of this will be built up in some time, and when that occurs I can stick them all together as another feature.
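As a hedged illustration of the random-parameters-to-ensembles idea (plain base Julia, not the actual DifferentialEquations.jl API — the solver and distribution here are made up for the example), a Monte Carlo sketch for du/dt = -p*u with a random decay rate p:

```julia
using Random

# Toy forward solve: explicit Euler for du/dt = -p*u, u(0) = 1.
function euler_solve(p; u0=1.0, dt=1e-3, T=1.0)
    u = u0
    for _ in 1:round(Int, T/dt)
        u += dt * (-p * u)
    end
    return u
end

# Monte Carlo uncertainty propagation: sample the random parameter,
# solve deterministically per sample, and summarize the ensemble.
Random.seed!(42)
samples = [euler_solve(1.0 + 0.1 * randn()) for _ in 1:10_000]
m = sum(samples) / length(samples)
s = sqrt(sum((x - m)^2 for x in samples) / (length(samples) - 1))
println((m, s))   # mean of u(1) should be close to exp(-1) ≈ 0.368
```

The ensemble mean and spread are exactly the kind of per-timepoint statistics a multi-node Monte Carlo layer would return; a density estimate over `samples` approximates what the Forward Kolmogorov route would give directly.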

On the same idea, uncertainty analysis can also be done using interval arithmetic. Since DifferentialEquations.jl works with ArbFloats, it will work with ArbReals. I'm working with the creator of ArbReals to make this a reality: being able to use interval arithmetic to get an interval for the solution at each timepoint. While this won't handle all cases (it wouldn't handle solutions that branch into N different possible solutions; it would just give one large interval), it would be an interesting feature (and then a plot recipe would be easy to make which would produce a ribbon plot to show the uncertainty. In fact, adding this would be like 2-3 lines of code).
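To make the interval idea concrete, here is a minimal hand-rolled sketch in base Julia. ArbReals would do this rigorously with directed rounding; this toy type only illustrates "an interval for the solution at each timepoint":

```julia
# A toy interval type; ArbReals would handle rounding rigorously.
struct Interval
    lo::Float64
    hi::Float64
end

function Base.:*(a::Interval, b::Interval)
    ps = (a.lo * b.lo, a.lo * b.hi, a.hi * b.lo, a.hi * b.hi)
    return Interval(minimum(ps), maximum(ps))
end

# du/dt = -p*u with uncertain p ∈ [0.9, 1.1]. Using the factored Euler
# update u ← u * (1 - dt*p) keeps the parameter dependency tight and
# avoids the interval width blowing up across steps.
function propagate(u::Interval, step::Interval, n)
    for _ in 1:n
        u = u * step
    end
    return u
end

dt = 1e-3
step = Interval(1 - 1.1 * dt, 1 - 0.9 * dt)
u = propagate(Interval(1.0, 1.0), step, 1000)
println((u.lo, u.hi))   # brackets [exp(-1.1), exp(-0.9)] ≈ (0.333, 0.406)
```

The per-timepoint `(lo, hi)` pairs are what the ribbon-plot recipe would consume.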


Parameter Inference

The other idea is parameter inference. I have a lot of plans for this. I am actually working on a few projects / algorithms myself for this. However, for this to have the stability that a library needs, a few things need to change. First, an API has to be in place for parameters to be explicit in the model. This is already being discussed in #41, and more broadly by SciML/Roadmap#2 (the discussion for this has been in the chatrooms). Even if other libraries aren't going forward with this, there are clear reasons why I have to: compatibility for parameter inference, sensitivity analysis, bifurcation diagrams, etc. So in #41 I was exploring methods for enclosing parameters so that if parameters are not used, it's a no-op (no operation), and if passed parameters are used, they would be essentially free. I found a way to do so, so now it's just about implementing it. Then all functions will be allowed to be (t,u,du,p) where p is a type for holding the parameters.
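A sketch of that proposed signature (hypothetical names; the real API is whatever lands in #41): a user parameter type threaded through an in-place f(t,u,du,p).

```julia
# Hypothetical illustration of the proposed (t,u,du,p) signature:
# `p` is any user type holding the parameters, so if it's unused the
# extra argument is a no-op, and if used the field accesses are free.
struct LotkaParams
    a::Float64
    b::Float64
    c::Float64
    d::Float64
end

function lotka!(t, u, du, p::LotkaParams)
    du[1] =  p.a * u[1] - p.b * u[1] * u[2]
    du[2] = -p.c * u[2] + p.d * u[1] * u[2]
    return nothing
end

# A solver just passes `p` through untouched, e.g. one Euler step:
u  = [1.0, 1.0]
du = similar(u)
p  = LotkaParams(1.5, 1.0, 3.0, 1.0)
lotka!(0.0, u, du, p)
u .+= 0.01 .* du
println(u)   # [1.005, 0.98]
```

Because `p` is a concrete type, field accesses compile away, which is what makes the "essentially free when used" claim plausible.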

Once parameters are explicit like that, you can easily "jiggle the parameters". So sensitivity analysis will basically be free just by using ForwardDiff to take the parameter derivatives along the solution. The API will be pretty simple: you just specify the fields of p which you want to do this with, and at every timepoint at which this is saved it can be done. This will be done via keywords to solve, and will just be an addon that you can ask for. It can get more in depth, but that's the basic idea. It's clear how bifurcation diagrams can be made, and that can be a bifurcate(prob,bifparameter=:param) where :param is the parameter of interest (other things will need to get passed, like intervals to look for solutions on, etc., then it can do arclength continuation, but you get the picture).
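A hedged sketch of that "parameter derivatives along the solution" idea, with a central finite difference standing in for ForwardDiff's dual numbers (the toy solver is illustrative, not a real API):

```julia
# Sensitivity of the solution to a parameter: d u(T) / dp for
# du/dt = -p*u. ForwardDiff would push dual numbers through the solver;
# a central difference computes the same quantity for illustration.
function solve_decay(p; u0=1.0, dt=1e-3, T=1.0)
    u = u0
    for _ in 1:round(Int, T/dt)
        u += dt * (-p * u)
    end
    return u
end

p, h = 1.0, 1e-6
sens = (solve_decay(p + h) - solve_decay(p - h)) / (2h)
println(sens)   # analytic sensitivity is -T*exp(-p*T) ≈ -0.368 at p = T = 1
```

The dual-number version would give this derivative at every saved timepoint in a single solve, which is why it comes "basically for free".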

Tackling parameter inference is then about linking to machine learning suites. The reason why I have this last on my mind, even though I already have prototypes, is that this will need to link into proper machine learning suites, and so I would like to wait until JuliaML/Optim matures a little. But the idea is that you need a model function (with explicit parameters), a cost function, and an ODE solver. It then iteratively solves the ODE with given parameters, trying to minimize the cost function using whatever ML algorithm is chosen. As you can see, this means that there really isn't much left to do other than to make an infer(prob::ODEProblem,cfun::Function;mlalg=:MinimizerAlg,odealg=:DP5) where your cfun is a cost function on the solution, parameters, and some data, e.g. minimizing the mean-square error from the data while keeping the L2 norm of the parameters low (a regularized cost function). All of the tools are in place to just call the minimizing function, so this isn't difficult, it just needs to be done.
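The iterate-solve-and-minimize loop can be sketched end to end in base Julia. A crude grid search stands in for the JuliaML/Optim backend, and all names here are illustrative, not the eventual infer API:

```julia
# Fit p in du/dt = -p*u to data by minimizing a regularized
# least-squares cost, re-solving the ODE at each candidate parameter.
function solve_at(p, ts; u0=1.0, dt=1e-3)
    u, t = u0, 0.0
    out = Float64[]
    for tnext in ts
        while t < tnext - 1e-12
            u += dt * (-p * u)
            t += dt
        end
        push!(out, u)
    end
    return out
end

ts   = collect(0.1:0.1:1.0)
data = solve_at(1.3, ts)          # synthetic "measurements" from true p = 1.3

# Regularized cost: squared error against the data plus an L2 penalty on p.
cost(p; λ=1e-4) = sum(abs2, solve_at(p, ts) .- data) + λ * p^2

grid  = 0.5:0.01:2.0
p_hat = grid[argmin([cost(p) for p in grid])]
println(p_hat)   # recovers ≈ 1.3
```

Swapping the grid search for a proper optimizer (and the toy Euler solve for a real solver with dense output) is exactly the "just call the minimizing function" step.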


Conclusion

So yeah, the conclusion is that I'm heading there. The ODEProblem is designed to have the information of the problem in such a way that parameter inference, sensitivity analysis, bifurcation diagrams, conversions to SDEs, etc. are all possible with the same object you'd give to the ODE solver. Right now I'm just rounding out the ODE solver and going to make (S)PDE solvers for the equations that tend to show up everywhere in these, and then the last step will just be piecing them all together.

If bloat begins to be a problem at all, some of these can spawn out to be separate libraries, i.e. using ODEParameterInference would import the right things for the infer function to work, but since it would work directly on the ODEProblem type, there shouldn't be much more the user has to do.

This is my plan at least. With the dense output for ODEs pretty much done, I'll start optimizing the library a bit to make the benchmarks do even better. The native non-stiff solvers are already the most efficient you can find according to the tests, so then I want to round out the wrappers for Sundials/ODEInterface for dense output so that way their non-stiff solvers are fully featured. Then any day now my paper on adaptive SDE algorithms should make it through peer review and be published, at which point I can flip the switch and add that to the SDE suite. Then I have another paper going out on new higher order Runge-Kutta methods for SDEs to further improve the efficiency/stability there. With the ODE/SDE libraries fully featured and benchmarked as highly efficient, the rest of this is tagging on features which is easy to do without cluttering things up via dispatch.

TL;DR: I'm working on it.

@nantonel

Hi all,

what about optimal control? I haven't looked at the package properly (so maybe I've missed it), but it seems to me that there is no input signal, just initial conditions in the ODEs.

An example of code that deals with optimal control and automatic differentiation is CasADi: https://github.com/casadi/casadi/wiki

Niccolo'

@ChrisRackauckas ChrisRackauckas changed the title Will DifferentialEquations deal with the Inverse Problem? DifferentialEquations Roadmap Sep 12, 2016
@ChrisRackauckas
Member

I don't currently have optimal control on the list, mostly because right now I don't know of a way to get academic credit for it. I would need to find the right project / know that there are enough people interested in it to drive up the repository stars (or some other metric of usage) in order to convince my adviser / any future funding committees that it's worthwhile. All of the other things I have mentioned line up with numerical methods research that I am doing, or systems biology applications, all leading to publications while developing tools I know are of great interest. That's not to say optimal control isn't something that can be on the list, there are definitely a lot of systems biology applications and numerical methods to be done, but I haven't run into one yet.

However, to have it built into the system, it would just require that the parameters can be functions, which will be possible with the suggested change of #11. Then special solvers would need to use that fact. So with this design it's definitely not difficult to make solvers which act on the same types to do optimal control, but it's not where I am going next unless I find out I really need it (but if anyone is really interested in implementing it, we should discuss).

@finmod
Author

finmod commented Sep 13, 2016

Thank you for this comprehensive reply. I guess that the best posture is to keep abreast of updates as they become available.

Concerning uncertainty quantification, it would be good to stay with the examples of other researchers so that they can become benchmarks across different packages. On the inverse problem, a good overview of the various dimensions DifferentialEquations should achieve is given here: http://www.birs.ca/events/2013/5-day-workshops/13w5151/videos/watch/201311140905-Campbell.html . Some packages may already be familiar to you, like LeastSquaresOptim, and the conversion of PODES from Matlab to Julia should be a breeze for you.

Concerning optimal control, it comes indirectly into your project. A look at the review of statistical inference for dynamical systems here: https://arxiv.org/abs/1204.6265 indicates that the optimal control solution as in http://www.physik3.gwdg.de/~ulli/pdf/SBP11_preprint.pdf was the best available in automatic differentiation for quite some time. It was used in biophysics. Unfortunately, as we have experienced over the last two days, bringing all of these methods under one single umbrella in Julia is much needed to avoid layers of wrappers. For instance, trying to set up pyodefit and pysparseLM of Schumann-Bischoff and Parlitz is hell. Julia appears to be more congenial and almost as fast as C.

Finally, concerning machine learning, there is this week a GP summer school in Sheffield, England on uncertainty quantification. Keep watching https://github.com/SheffieldML/GPyOpt , where Javier Gonzalez will shortly come up with the PODES examples of uncertainty quantification in Python. Of course, it is not Julia, but it is in-place calculation in Python.

Keep up the good work.


@ChrisRackauckas
Member

Unfortunately, as we have experienced over the last two days, bringing all of these methods under one single umbrella in Julia is much needed to avoid layers of wrappers.

Well, the native Julia packages all seem to work on the first go. The only issue there is that I am wrapping some things that are unreleased, like the JuliaODE ODE.jl development branch. It will require no effort once that's all on the main release of ODE.jl (a month or so off). For example, I already use the automatic differentiation from ForwardDiff and use it inside of NLsolve for the implicit methods (for example, Rosenbrock). I haven't seen an issue filed there, so that seems to have worked seamlessly. Python packages which are well-wrapped, like PyPlot, tend to work seamlessly as well.

What isn't seamless, and what will never be seamless, is something like ODEInterface, because something can always go wrong when trying to compile Fortran (though Sundials has done well to make its binary dependencies easy to use). I think Sundials on Windows actually just ships a precompiled binary to stop this issue, since compilers on Windows can be a mess. But this should be easier on Linux: I haven't had these troubles there. But again, relying on C/Fortran will lead to these platform-specific troubles, and should be done only as necessary, more so to create a performance benchmark than as something to tell users to rely on (though early on, they will need to in some places, like stiff ODEs).

That said, there are plenty of major advantages that we can get by rolling native Julia implementations. A lot of these things cannot be vectorized well and must use loops, which is one major reason for my switch to Julia (example: stochastic differential equations in the general sense cannot be vectorized in a MATLAB-friendly way, and even if they could, that would just create a ton of temporaries and be slow compared to C/Fortran. In this list, anything dealing with optimization is a place where vectorization is usually a no-go). With everything native, I can plug into things like Optim.jl for a well-optimized Levenberg-Marquardt (I already use NLsolve, so if you're using DifferentialEquations.jl you already have Optim.jl silently installed and working). Also, due to putting together these dense output algorithms, I have plenty of high-order interpolation methods to choose from. So it would be a breeze to make this implementation faster and more accurate, whereas in pyodefit they re-make everything they need from scratch, and it's clear that some of the functions they made are not optimal for performance. It's more about getting it done.

@ChrisRackauckas
Member

Something I should state is, if anyone is willing to help push in some direction, just let me know if you want help. It's not difficult to build the wrapper over different algorithms which sets up types and all, as is done in all of the solve dispatches. So if you're interested in a bifurcationplot or parameter_estimation function which takes in an ODEProblem, puts it into some algorithm, and builds the solution, we should talk about the API and I can help put together a shell so that it's as easy to implement as plugging in algorithms.

Also, if you know of ways to get academic credit / funding for any of this, let me know. Right now what drives most of the work is what new methods I'm developing for publications (the SDE library) or for software publications (the ODE suite). If I can find a good way to get publications for doing things like putting together the parameter estimation algorithms, optimizing(/parallelizing) them, and benchmarking them, then that development time is much easier to find (I know working academics/scientists are interested in this, but putting these out as a bunch of arXiv preprints can't make a career). Otherwise, for any credit I need to develop my own algorithm and then write a paper around it, and the other algorithms will get implemented because I will need them to be benchmarked in the paper (which then feature-completes the suite). So when I say parameter estimation is on the list, I have some ideas/projects in that direction, whereas in optimal control I can do it and probably optimize it better than previous implementations (if you haven't seen the non-stiff ODE benchmarks, check them out. They're great!), but I don't know of a way to get credit because I don't have a new idea. I have to bend a little bit in the direction of my overlords.

BTW, since it hasn't been mentioned here, there are some updates to the SDE library which are currently blocked by peer review (my adviser is adamant that I don't release any new algorithms before the paper is published, so there are fast adaptive methods for SDEs and new RK methods for SDEs which are implemented but on a private branch waiting to be released).

I'll just keep working. Julia is fun because once you run benchmarks and see that you have a "chance to win / are winning", you just have to spend all week going all out 👍 .

I'm undecided, after finishing this round of polish on the ODE/SDE libraries, whether it's better to flesh out the basic components first (add/optimize/parallelize/multithread Adams methods, BDF/NDF, TSRK methods, ...), or expand the reach first (add bifurcation plots, phase diagrams, parameter estimation, etc.) and come back to optimize their performance afterwards.

@ChrisRackauckas
Member

Update: The timeline for parameter estimation will be simply to wait a little bit on that until JuliaML sorts itself out. See JuliaML/META#9.

@ahwillia

@ChrisRackauckas, I'd really like to see the parameter estimation stuff happen. What do you need from us specifically?

@ChrisRackauckas
Member

Just an easy way to define a cost function (with regularization) and optimize it. The simplest standard methods use nonlinear regression. For example, if you had time series data d(t), you'd just define a cost function g(theta) = \int ||u(t,theta) - d(t)||_2, where u(t,theta) is the numerical solution to the differential equation with parameters theta. So at each iteration you solve the ODE with parameters theta, determine the cost, and then pick new parameters according to whatever machine learning algorithm you chose. With a machine learning framework where you can just stick in your own cost function, this should just work.

More advanced methods improve on this, but the basic idea is the same: the cost function just needs to call some kind of differential equation solver. Then a package for doing this just wraps this all so that you just specify:

  • the type of cost function (in terms of the numerical ODE solution)
  • the ODE solver algorithm
  • the machine learning / optimization algorithm

The most difficult part on the side of the differential equation solvers is having a continuous output so that way it lines up with the data (or however you want to do the cost function). I just finished that.
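That cost function, discretized on the saved timepoints (1-D for simplicity, so the norm is an absolute value; a sketch, not a committed API):

```julia
# Trapezoid-rule discretization of g(θ) = ∫ ‖u(t,θ) − d(t)‖ dt on the
# grid of saved timepoints.
function g(u_vals, d_vals, ts)
    r = abs.(u_vals .- d_vals)
    return sum((ts[i+1] - ts[i]) * (r[i] + r[i+1]) / 2 for i in 1:length(ts)-1)
end

ts     = collect(0.0:0.1:1.0)
u_vals = exp.(-1.0 .* ts)    # trial solution with θ = 1.0
d_vals = exp.(-1.2 .* ts)    # "data" generated with θ = 1.2
println(g(u_vals, d_vals, ts))   # ≈ 0.05
```

With dense (continuous) output, `u_vals` can be evaluated at exactly the data's timepoints, which is why that feature was the blocking piece.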

@ahwillia

The only thing we'd need from you is the gradient of u(...) with respect to theta. Other than that, this can get done pretty soon I think!

@ChrisRackauckas
Member

And that I can just grab using ForwardDiff.jl once I do the change for explicit parameters #41 / SciML/Roadmap#2. Yeah, so it's not so difficult once all of the pieces are together.

@finmod
Author

finmod commented Sep 17, 2016

Reacting to:

“I'm undecided, after finishing this round of polish on the ODE/SDE libraries, whether it's better to flesh out the basic components first (add/optimize/parallelize/multithread Adams methods, BDF/NDF, TSRK methods, ...), or expand the reach first (add bifurcation plots, phase diagrams, parameter estimation, etc.) and come back to optimize their performance afterwards.”

I would advise focusing on the reach first, because this is where the potential contribution to the field is. The process is to gradually move away from numerical differentiation, towards dispensing with all forms of differentiation (numerical, symbolic, automatic) and moving to probabilistic or newer methods/algorithms like “rough paths” and regularity structures, as in Peter Friz and Martin Hairer. Refining and comparing the basic components comes as a second step, and you already have a wide array of tools on the same bench.

Concerning the financing, I came to Julia via the QuantEcon project of Thomas Sargent and John Stachurski. Their experience shows that you need to develop a good product first, reaching a broad population of applications and fields; their objective is that the computing should not be in the way of the economics. Then you can apply for financing from the Alfred P. Sloan Foundation, as they have done, once the relevance and effectiveness of DifferentialEquations to the various fields is obvious to all. Cambridge, England is also keen (the Isaac Newton Institute, which is part of Cambridge Maths, and the Cambridge Engineering Dept.).


@ChrisRackauckas
Member

Updates:

  • A lot of these newer features rest on ParameterizedFunctions.jl. Not only do its symbolically-calculated Jacobians and the like really speed up computations, it also provides an easy interface for implementing the following things in a performant way.
  • ParameterEstimation.jl builds objective functions for you to solve with Optim.jl. Docs are already written but they won't be added until the modularization change gets pushed to master.
  • Bifurcate.jl is an interface to PyDSTool.jl which does bifurcation analysis. PyDSTool is fully functional; Bifurcate.jl will get some polish and will be documented as well.
  • I implemented rudimentary sensitivity analysis, but it wasn't good... I am re-doing the implementation in Sensitivity.jl.

Lastly, if someone could help me find out how to use Mads.jl, then I will be adding its functionality to ParameterEstimation.jl and Sensitivity.jl. As seen with the other parts, ParameterizedFunctions are general enough that they should be able to just write the Mads model for you, so there will be wrappers to Mads.jl where, instead of doing all of that crazy defining of .mads files and the like, it will write all the extra things so that all you as a user have to do is pass a ParameterizedFunction and some solver options. However, to get this implemented, the only hard part is finding out how to use Mads.jl. I made contact: see madsjulia/Mads.jl#7. Using Mads.jl will give us more ParameterEstimation functions, local/global sensitivity analysis, and functionality for optimal control and uncertainty quantification. I think having this directly accessible through DifferentialEquations.jl/ParameterizedFunctions.jl in a way that's almost transparent to the user will be a major win, and using Mads.jl will help make sure I don't have to re-invent the wheel here.

@ChrisRackauckas
Member

Another change people might want to be aware of is the implementation of "Models" packages:

  • FinancialModels.jl
  • MultiScaleModels.jl

I plan on doing a SystemsBiologyModels.jl, and then it can be expanded however people wish. These just have a bunch of helpers for defining common problems within a domain. For example, you could define a BlackScholesProblem just by giving a constructor a few parameters, and then this problem can be solved using the SDE solvers, or when available it can directly use the ParameterEstimation tools, etc. If there's anywhere else to expand this idea to, let me know. It's relatively easy to code up some models for this, but I need to document it a little more and register/tag it. MultiScaleModels.jl is a real beauty, but it's hard to explain. Again, at its core, it also just builds ODEs/SDEs/hybrid models.
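The "Models package" idea in miniature (all names here are hypothetical illustrations, not the actual FinancialModels.jl API): a Black-Scholes-style constructor is really just geometric Brownian motion handed to an SDE solver.

```julia
using Random

# A domain "model" type that only stores parameters; helper functions
# turn it into SDE drift/diffusion terms any solver could consume.
struct GBMProblem
    μ::Float64    # drift
    σ::Float64    # volatility
    S0::Float64   # initial price
end

drift(m::GBMProblem, S)     = m.μ * S
diffusion(m::GBMProblem, S) = m.σ * S

# One-path Euler–Maruyama, standing in for the SDE suite.
function simulate(m::GBMProblem; T=1.0, dt=1e-3, rng=MersenneTwister(7))
    S = m.S0
    for _ in 1:round(Int, T/dt)
        S += drift(m, S) * dt + diffusion(m, S) * sqrt(dt) * randn(rng)
    end
    return S
end

S = simulate(GBMProblem(0.05, 0.2, 100.0))
println(S)   # one plausible end-of-horizon price path value
```

The point of the design is that the same `GBMProblem` value could later be fed to parameter estimation or uncertainty tools without redefining anything.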

@ChrisRackauckas
Member

Okay, it looks like I have the whole pipeline together. So the DiffEq ecosystem supports parameter estimation already, here's the docs for that, and it can be expanded. The method for doing add-ons like this is highly extendable, and I'm doing this for optimal control, bifurcation plotting, sensitivity analysis, and uncertainty quantification. I think that means I can close this: it's a definitive "yes we support it!".

However, there's a long way to go. Please open more targeted issues (like this one: #100) for algorithms to implement and equations to support.
