The PINN (Physics-Informed Neural Network) solver provides an alternative to the finite-difference level set method. Instead of discretizing the PDE on a grid, a neural network \(\varphi_\theta(x, y, t)\) is trained to satisfy the Hamilton-Jacobi fire spread equation:

\[\frac{\partial \varphi}{\partial t} + F(x, y)\,|\nabla \varphi| = 0.\]

To enforce the initial condition exactly, the raw network output is wrapped in a hard-constraint decomposition

\[\tilde\varphi(x, y, t) = \frac{\text{IC}(x, y)}{L} + \tau(t)\,\varphi_\theta(x, y, t),\]

where \(L\) is a length scale that nondimensionalizes the level set function and \(\tau(t) = (t - t_{\min}) / (t_{\max} - t_{\min})\) is zero at \(t = t_{\min}\), ensuring \(\tilde\varphi(x,y,t_{\min}) = \text{IC}(x,y)/L\) exactly. This eliminates the need for an IC loss term and lets all training focus on learning the PDE dynamics.
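To see why the decomposition is exact at \(t_{\min}\) regardless of the network weights, here is a plain-Julia sketch that substitutes a toy closure for the real network; the ignition location, radius, and the value of \(L\) are illustrative assumptions, not package defaults:

```julia
tmin, tmax = 0.0, 20.0
L = 1000.0  # length scale for nondimensionalization (illustrative value)

# tau(tmin) == 0, so the network term vanishes at the initial time
tau(t) = (t - tmin) / (tmax - tmin)

# Initial condition: signed distance to a circular ignition (negative = burned)
ic(x, y) = sqrt((x - 500.0)^2 + (y - 500.0)^2) - 80.0

# Toy stand-in for the raw network phi_theta(x, y, t)
phi_theta(x, y, t) = 0.1 * sin(x / L) * cos(y / L) * (1.0 + t)

# Hard-constrained output: exact IC at t = tmin for *any* network weights
phi_tilde(x, y, t) = ic(x, y) / L + tau(t) * phi_theta(x, y, t)

phi_tilde(500.0, 500.0, tmin) == ic(500.0, 500.0) / L  # true by construction
```

Because \(\tau(t_{\min}) = 0\) multiplies the network term to exactly zero in floating point, the identity holds bitwise, not just approximately.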
## Advantages
- **Mesh-free:** evaluate \(\varphi\) at any continuous \((x, y, t)\); no grid interpolation needed
- **Continuous in time:** query the fire state at any \(t\) without stepping through intermediate time steps
- **Exact initial condition:** the hard constraint guarantees a perfect IC fit
- **Data assimilation ready:** an optional observation loss term incorporates satellite/sensor data
## Requirements
The PINN solver is a package extension: it only loads when the ML dependencies are available.
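A typical setup might look like the following sketch. The package and dependency names here are assumptions; the actual trigger packages are listed under `[extensions]` in the project's `Project.toml`:

```julia
# Hypothetical names: replace with the actual core package and the ML
# dependencies listed under [extensions] in its Project.toml.
using FireSpread            # core package (name assumed)
using Lux, Optimization     # assumed weak dependencies that trigger the PINN extension

# Once loaded, the extension's API (e.g. train_pinn, predict_on_grid!) is available.
```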
## Comparison with Finite Differences

The PINN solution can be compared against the standard finite-difference solver. Note that PINNs are an approximate method: accuracy improves with larger networks, more collocation points, and longer training:
```julia
# Finite-difference reference with constant F = 5.0 m/min
grid_fd = LevelSetGrid(20, 20, dx=50.0)
ignite!(grid_fd, 500.0, 500.0, 80.0)
F = fill(5.0, size(grid_fd))
for _ in 1:20
    advance!(grid_fd, F, 0.5)
end

# PINN prediction at the same time
grid_pinn = LevelSetGrid(20, 20, dx=50.0)
predict_on_grid!(grid_pinn, sol, 10.0)

fig = Figure(size=(700, 300))
ax1 = Axis(fig[1, 1], title="Finite Differences (t=10)", aspect=DataAspect())
fireplot!(ax1, grid_fd)
hidedecorations!(ax1)
ax2 = Axis(fig[1, 2], title="PINN (t=10)", aspect=DataAspect())
fireplot!(ax2, grid_pinn)
hidedecorations!(ax2)
fig
```
## Loss Function

Training minimizes a weighted sum of the following terms:

- **PDE residual** (\(\mathcal{L}_{\text{pde}}\)): enforces the Hamilton-Jacobi equation at randomly sampled interior collocation points
- **Boundary condition** (\(\mathcal{L}_{\text{bc}}\)): penalizes burned regions (\(\varphi < 0\)) at the domain boundary
- **Data loss** (\(\mathcal{L}_{\text{data}}\)): optional; fits to observed fire perimeter data
The initial condition is enforced exactly through the hard constraint decomposition (no IC loss term needed).
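A minimal, self-contained sketch of the two mandatory loss terms follows. It uses central finite differences in place of automatic differentiation and a toy closure in place of the trained network; the spread rate `F`, domain size, and sample counts are illustrative assumptions:

```julia
# Toy constrained output (stand-in for the trained network).
# All constants here are illustrative, not package defaults.
tmin, tmax = 0.0, 20.0
L = 1000.0
F = 5.0  # constant spread rate, m/min (assumption)
tau(t) = (t - tmin) / (tmax - tmin)
ic(x, y) = sqrt((x - 500.0)^2 + (y - 500.0)^2) - 80.0
phi_theta(x, y, t) = 0.1 * sin(x / L) * cos(y / L) * (1.0 + t)
phi_tilde(x, y, t) = ic(x, y) / L + tau(t) * phi_theta(x, y, t)

# Central differences stand in for automatic differentiation
h = 1e-4
dphi_dt(x, y, t) = (phi_tilde(x, y, t + h) - phi_tilde(x, y, t - h)) / 2h
dphi_dx(x, y, t) = (phi_tilde(x + h, y, t) - phi_tilde(x - h, y, t)) / 2h
dphi_dy(x, y, t) = (phi_tilde(x, y + h, t) - phi_tilde(x, y - h, t)) / 2h

# L_pde: Hamilton-Jacobi residual phi_t + F*|grad phi| at random interior points
residual(x, y, t) = dphi_dt(x, y, t) + F * hypot(dphi_dx(x, y, t), dphi_dy(x, y, t))
xs, ys = 1000 .* rand(64), 1000 .* rand(64)
ts = tmin .+ (tmax - tmin) .* rand(64)
L_pde = sum(residual.(xs, ys, ts) .^ 2) / 64

# L_bc: penalize burned (phi < 0) points on the domain boundary
bx = [0.0, 1000.0, 500.0, 500.0]
by = [500.0, 500.0, 0.0, 1000.0]
L_bc = sum(max.(-phi_tilde.(bx, by, tmax), 0.0) .^ 2) / 4

loss = L_pde + L_bc  # a data term would be added here when observations exist
```

Note that because \(\tilde\varphi = \varphi / L\), the residual keeps the same form as the dimensional equation: dividing \(\varphi_t + F|\nabla\varphi| = 0\) through by \(L\) leaves \(F\) unchanged.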
## Data Assimilation
Pass observations as a tuple of vectors `(t, x, y, phi)` to incorporate fire perimeter data:
```julia
# Example: observations at t=10 along a known fire boundary
obs_t = fill(10.0, 50)
obs_x = rand(400.0:800.0, 50)
obs_y = rand(400.0:800.0, 50)
obs_phi = zeros(50)  # on the fire front (phi = 0)

sol = train_pinn(grid, model, (0.0, 20.0);
                 observations=(obs_t, obs_x, obs_y, obs_phi))
```
## References
- Raissi, M., Perdikaris, P., & Karniadakis, G. E. (2019). Physics-informed neural networks: A deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations. *Journal of Computational Physics*, 378, 686-707.
- Osher, S., & Sethian, J. A. (1988). Fronts propagating with curvature-dependent speed: Algorithms based on Hamilton-Jacobi formulations. *Journal of Computational Physics*, 79(1), 12-49.