Update PINNOptimizers Benchmarks #1148

Merged: 8 commits, Jan 20, 2025
16 changes: 8 additions & 8 deletions benchmarks/PINNOptimizers/1d_diffusion.jmd
@@ -12,8 +12,8 @@ science-guided AI techniques.
## Setup

```diff
-using NeuralPDE, OptimizationFlux, ModelingToolkit, Optimization, OptimizationOptimJL
-using Lux, Plots
+using NeuralPDE, ModelingToolkit, Optimization, OptimizationOptimJL
+using Lux, Plots, OptimizationOptimisers
 import ModelingToolkit: Interval, infimum, supremum
```

@@ -77,12 +77,12 @@ end
```

```diff
-opt1 = ADAM()
-opt2 = ADAM(0.005)
-opt3 = ADAM(0.05)
-opt4 = RMSProp()
-opt5 = RMSProp(0.005)
-opt6 = RMSProp(0.05)
+opt1 = Optimisers.ADAM()
+opt2 = Optimisers.ADAM(0.005)
+opt3 = Optimisers.ADAM(0.05)
+opt4 = Optimisers.RMSProp()
+opt5 = Optimisers.RMSProp(0.005)
+opt6 = Optimisers.RMSProp(0.05)
 opt7 = OptimizationOptimJL.BFGS()
 opt8 = OptimizationOptimJL.LBFGS()
```
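For context on where these optimizer objects end up: in a typical Optimization.jl workflow, an Optimisers.jl rule such as `Optimisers.ADAM(0.05)` is passed directly to `solve`. The sketch below is an assumption-laden stand-in, not the benchmark's actual code: `loss` and `u0` are a toy quadratic objective and toy parameters, hypothetical substitutes for the discretized PINN loss and network parameters used in these benchmarks.

```julia
# Hedged sketch of driving an Optimisers.jl rule through Optimization.jl.
# `loss` and `u0` are toy stand-ins for the benchmark's PINN loss/parameters.
using Optimization, OptimizationOptimisers, Zygote

loss(u, p) = sum(abs2, u .- 1.0)   # toy quadratic, minimum at u = [1, 1]
u0 = zeros(2)

optf = OptimizationFunction(loss, Optimization.AutoZygote())
prob = OptimizationProblem(optf, u0)
sol  = solve(prob, Optimisers.ADAM(0.05); maxiters = 500)
```

The same `solve(prob, opt; maxiters = ...)` call works unchanged for any of `opt1`–`opt8`, which is what lets the benchmark sweep over them in a loop.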
16 changes: 8 additions & 8 deletions benchmarks/PINNOptimizers/1d_poisson_nernst_planck.jmd
@@ -12,8 +12,8 @@ science-guided AI techniques.
## Setup

```diff
-using NeuralPDE, OptimizationFlux, ModelingToolkit, Optimization, OptimizationOptimJL
-using Lux, Plots
+using NeuralPDE, ModelingToolkit, Optimization, OptimizationOptimJL
+using Lux, Plots, OptimizationOptimisers
 import ModelingToolkit: Interval, infimum, supremum
```

@@ -158,12 +158,12 @@ end
```

```diff
-opt1 = ADAM()
-opt2 = ADAM(0.005)
-opt3 = ADAM(0.05)
-opt4 = RMSProp()
-opt5 = RMSProp(0.005)
-opt6 = RMSProp(0.05)
+opt1 = Optimisers.ADAM()
+opt2 = Optimisers.ADAM(0.005)
+opt3 = Optimisers.ADAM(0.05)
+opt4 = Optimisers.RMSProp()
+opt5 = Optimisers.RMSProp(0.005)
+opt6 = Optimisers.RMSProp(0.05)
 opt7 = OptimizationOptimJL.BFGS()
 opt8 = OptimizationOptimJL.LBFGS()
```
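The optimizer list pairs first-order rules (`opt1`–`opt6`) with quasi-Newton methods (`opt7`, `opt8`), which also supports the common PINN recipe of a cheap first-order warm-up followed by LBFGS refinement. The sketch below shows that two-stage pattern under stated assumptions: `loss` and `u0` are hypothetical toy stand-ins, not the benchmark's Poisson–Nernst–Planck objective.

```julia
# Hedged sketch: first-order warm-up (Optimisers.ADAM) followed by
# quasi-Newton refinement (OptimizationOptimJL.LBFGS).
# `loss` and `u0` are toy stand-ins for the benchmark's actual objective.
using Optimization, OptimizationOptimisers, OptimizationOptimJL, Zygote

loss(u, p) = sum(abs2, u .- 1.0)   # toy quadratic, minimum at u = [1, 1]
u0 = fill(5.0, 2)

optf = OptimizationFunction(loss, Optimization.AutoZygote())
prob = OptimizationProblem(optf, u0)

sol1  = solve(prob, Optimisers.ADAM(0.05); maxiters = 200)   # warm-up
prob2 = remake(prob; u0 = sol1.u)                             # restart at warm-up result
sol2  = solve(prob2, OptimizationOptimJL.LBFGS(); maxiters = 100)  # refinement
```

The benchmark itself times each optimizer independently rather than chaining them, but the chained form is a frequent follow-up when comparing these same rules.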