# API

## Types

### AbstractPINNConfig

```julia
AbstractPINNConfig
```

Supertype for PINN training configurations. Subtypes select the solver backend:

- `PINNConfig` - custom Lux solver with hard IC constraint
- `NeuralPDEConfig` - NeuralPDE.jl / ModelingToolkit solver
### NeuralPDEConfig

```julia
NeuralPDEConfig(; kwargs...)
```

Training hyperparameters for the NeuralPDE.jl PINN solver.

Uses a ModelingToolkit symbolic PDE definition with `PhysicsInformedNN` discretization. The IC and BCs are enforced as soft constraints (loss terms).
Requires NeuralPDE and ModelingToolkit to be loaded (triggers package extension).
#### Keyword Arguments

- `hidden_dims::Vector{Int}` - Hidden layer sizes (default `[16, 16]`)
- `activation::Symbol` - Activation function (default `:σ`)
- `strategy::Symbol` - Training strategy: `:grid` or `:stochastic` (default `:grid`)
- `grid_step::Float64` - Grid spacing for `GridTraining` (default `0.1`)
- `max_epochs::Int` - Maximum training iterations (default `1000`)
- `optimizer::Symbol` - Optimizer: `:lbfgs` or `:bfgs` (default `:lbfgs`)
- `learning_rate::Float64` - Learning rate for Adam (default `1e-2`)

#### Examples

```julia
config = NeuralPDEConfig(hidden_dims=[32, 32], max_epochs=2000)
sol = train_pinn(grid, model, tspan; config=config)
```

### PINNConfig

```julia
PINNConfig(; kwargs...)
```

Training hyperparameters for the custom Lux-based PINN solver.
The initial condition is enforced exactly via a hard constraint decomposition (no IC loss term or IC collocation points needed).
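A minimal sketch of how such a hard constraint can work (the decomposition form and function names here are illustrative assumptions, not the package internals): write the learned function as `phi0(x, y) + (t - t0) * N(t, x, y)`, so the initial condition holds identically at `t = t0` regardless of what the network outputs.

```julia
# Sketch of a hard-constraint IC decomposition (placeholder functions, not the
# package internals): phi(t0, x, y) == phi0(x, y) holds by construction.
phi0(x, y) = hypot(x, y) - 50.0         # initial level set: circle of radius 50
N(t, x, y) = 0.1t + 0.01 * (x + y)      # stand-in for the trained network

t0 = 0.0
phi(t, x, y) = phi0(x, y) + (t - t0) * N(t, x, y)

phi(t0, 30.0, 40.0) == phi0(30.0, 40.0)   # true for any N
```

Because the factor `(t - t0)` vanishes at the initial time, no IC loss term is needed; the optimizer only has to fit the PDE and boundary conditions.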
#### Keyword Arguments

- `hidden_dims::Vector{Int}` - Hidden layer sizes (default `[64, 64, 64]`)
- `activation::Symbol` - Activation function (default `:tanh`)
- `n_interior::Int` - PDE collocation points (default `5000`)
- `n_boundary::Int` - Boundary condition points (default `500`)
- `lambda_pde::Float64` - PDE loss weight (default `1.0`)
- `lambda_bc::Float64` - BC loss weight (default `1.0`)
- `lambda_data::Float64` - Data loss weight (default `1.0`)
- `learning_rate::Float64` - Adam learning rate (default `1e-3`)
- `max_epochs::Int` - Maximum training epochs (default `10000`)
- `resample_every::Int` - Resample collocation points every N epochs (default `500`)
- `lbfgs_epochs::Int` - L-BFGS refinement epochs after Adam (default `0`, disabled)
- `importance_sampling::Bool` - Concentrate points near the fire front (default `false`)
- `float32::Bool` - Use Float32 for NN weights (default `false`)

#### Examples

```julia
config = PINNConfig(hidden_dims=[128, 128], max_epochs=10000)
```

### PINNSolution
```julia
PINNSolution
```

Trained PINN model. Callable as `sol(t, x, y)` to evaluate the level set function.

#### Fields

- `model` - Neural network or callable evaluator
- `parameters` - Trained parameters
- `state` - Model state (backend-specific)
- `config::AbstractPINNConfig` - Training configuration
- `loss_history::Vector{Float64}` - Loss at each epoch
- `domain::NamedTuple` - `(tspan, xspan, yspan, phi_scale)` for input normalization
- `grid_ic` - Initial condition grid

#### Examples

```julia
phi = sol(10.0, 500.0, 500.0)  # evaluate at t=10, x=500, y=500
```

## Functions
### firegif

```julia
firegif(path, trace::Trace, grid::LevelSetGrid; residence_time=nothing, framerate=15, frontcolor=:black, frontlinewidth=2.0)
```

Create an animated GIF of fire spread from a `Trace` recorded during `simulate!`.

Uses the same visualization style as `fireplot!`.
Requires Makie (or a backend like CairoMakie / GLMakie) to be loaded.
#### Examples

```julia
using CairoMakie

grid = LevelSetGrid(200, 200, dx=30.0)
ignite!(grid, 3000.0, 3000.0, 50.0)
trace = Trace(grid, 5)
simulate!(grid, model, steps=100, trace=trace)

firegif("fire.gif", trace, grid)
firegif("fire.gif", trace, grid; residence_time=0.005)
```

### fireplot

```julia
fireplot(grid::LevelSetGrid; residence_time=nothing, frontcolor=:black, frontlinewidth=2.0)
```

Plot a `LevelSetGrid` as a heatmap with the fire front (φ = 0) overlaid as a contour line. Returns a `Makie.Figure`.
When `residence_time` is provided, uses a burnout-aware colormap: burned cells transition from yellow (just ignited) through red to black (burnt out), and unburned cells transition from white (near the front) to green (far away).

Without `residence_time`, falls back to a symmetric φ heatmap with the `:RdYlGn` colormap.
Requires Makie (or a backend like CairoMakie / GLMakie) to be loaded.
#### Examples

```julia
using CairoMakie

grid = LevelSetGrid(100, 100, dx=30.0)
ignite!(grid, 1500.0, 1500.0, 100.0)

fireplot(grid)
fireplot(grid; residence_time=0.005)
```

### fireplot! {#fireplot!}

```julia
fireplot!(ax, grid::LevelSetGrid; residence_time=nothing, frontcolor=:black, frontlinewidth=2.0)
```

In-place version of `fireplot`: draws into an existing `Axis`.
Requires Makie (or a backend like CairoMakie / GLMakie) to be loaded.
### predict_on_grid

```julia
predict_on_grid(sol::PINNSolution, grid::LevelSet.LevelSetGrid, t)
```

Evaluate the trained PINN on every cell center of `grid` at time `t`. Returns a matrix of phi values with the same dimensions as `grid`.
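Conceptually, this broadcasts the callable solution over cell-center coordinates. A self-contained sketch with a placeholder closure standing in for a trained `PINNSolution` (the `(i - 0.5) * dx` cell-center convention and the dummy level set are assumptions for illustration):

```julia
# Placeholder "solution": a circular front expanding at speed 2, not a real PINN.
sol(t, x, y) = hypot(x, y) - 50.0 - 2.0 * t

nx, ny, dx = 4, 3, 30.0
center(i) = (i - 0.5) * dx   # assumed cell-center coordinate convention

# Evaluate at every cell center at t = 10: result has the grid's dimensions.
phi = [sol(10.0, center(i), center(j)) for i in 1:nx, j in 1:ny]

size(phi)   # (4, 3)
```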
### predict_on_grid! {#predict_on_grid!}

```julia
predict_on_grid!(grid::LevelSet.LevelSetGrid, sol::PINNSolution, t)
```

In-place version of `predict_on_grid`: updates `grid.phi` and `grid.t`.
### train_pinn

```julia
train_pinn(grid, model, tspan; config=PINNConfig(), ...)
train_pinn(grid, model, tspan, config; ...)
```

Train a Physics-Informed Neural Network to solve the fire spread level set PDE.

The PINN learns a function `phi_theta(x, y, t)` satisfying

```math
\frac{\partial \phi}{\partial t} + F(x, y, t)\, |\nabla \phi| = 0
```

where `F` is the spread rate from the `FireSpreadModel`.
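With a constant spread rate this PDE has a simple exact solution (a circular front growing at speed `F`), which makes the residual easy to check numerically. A self-contained sketch, with a placeholder `F` and finite differences standing in for the automatic differentiation a real PINN would use:

```julia
# Exact solution check: for constant F, phi = hypot(x, y) - r0 - F*t solves
# dphi/dt + F * |grad phi| = 0 (since |grad phi| = 1 for a signed distance).
F(t, x, y) = 2.0                                     # constant spread rate (placeholder)
phi(t, x, y) = hypot(x, y) - 50.0 - F(t, x, y) * t   # circular front moving at speed F

h = 1e-5                                             # central-difference step
d_t(t, x, y) = (phi(t + h, x, y) - phi(t - h, x, y)) / 2h
d_x(t, x, y) = (phi(t, x + h, y) - phi(t, x - h, y)) / 2h
d_y(t, x, y) = (phi(t, x, y + h) - phi(t, x, y - h)) / 2h

residual(t, x, y) = d_t(t, x, y) + F(t, x, y) * hypot(d_x(t, x, y), d_y(t, x, y))

abs(residual(1.0, 30.0, 40.0)) < 1e-4   # true: the exact solution zeroes the residual
```

During training, this residual is evaluated at many collocation points and its mean square becomes the PDE loss term.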
The solver backend is selected by the config type:

- `PINNConfig` - custom Lux solver with hard IC constraint (requires `Lux`)
- `NeuralPDEConfig` - NeuralPDE.jl symbolic solver (requires `NeuralPDE`)
#### Arguments

- `grid` - `LevelSetGrid` providing domain geometry and initial condition
- `model` - Callable `model(t, x, y) -> spread_rate` (e.g. `FireSpreadModel`)
- `tspan` - Time interval `(t_start, t_end)`
- `config` - `PINNConfig` or `NeuralPDEConfig` with training hyperparameters
- `observations` - Optional `(t, x, y, phi)` tuple of observation data (Lux backend only)
- `lbfgs_optimizer` - Optimizer for the L-BFGS refinement phase, e.g. `OptimizationOptimJL.LBFGS()` (Lux backend only; requires `lbfgs_epochs > 0` in config)
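Assuming each component of the `(t, x, y, phi)` observations tuple is a vector whose entry `i` gives one measurement (an interpretation, not confirmed by the docstring), assembling observation data might look like:

```julia
# Hypothetical observation data: entry i is one measured level set value
# phi_obs[i] taken at time t_obs[i] and location (x_obs[i], y_obs[i]).
t_obs   = [5.0, 5.0, 10.0]
x_obs   = [450.0, 520.0, 480.0]
y_obs   = [500.0, 490.0, 510.0]
phi_obs = [-3.2, 1.5, -0.8]

observations = (t_obs, x_obs, y_obs, phi_obs)
# sol = train_pinn(grid, model, (0.0, 50.0); observations=observations)
```

The mismatch between the PINN prediction and these values is weighted by `lambda_data` in the total loss.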
#### Returns

A `PINNSolution` callable as `sol(t, x, y)`.
#### Examples

```julia
# Custom Lux backend (default)
sol = train_pinn(grid, model, (0.0, 50.0))

# NeuralPDE backend
sol = train_pinn(grid, model, (0.0, 50.0); config=NeuralPDEConfig())
```