Optimizing parameters in a WOFOST crop model using diffWOFOST
This Jupyter notebook demonstrates the optimization of parameters in a
differentiable model using the diffwofost package. The package provides
differentiable implementations of the WOFOST model and its associated
sub-models. As diffwofost is under active development, this notebook focuses on
two sub-models: leaf_dynamics and root_dynamics.
To enable these models to operate independently, certain state variables required by the model are supplied as "external states" derived from the test data. Also, at this stage, only a limited subset of model parameters has been made differentiable.
The notebook is organized into two standalone sections that can be executed independently:
- Leaf Dynamics
- Root Dynamics
1. Leaf dynamics
In this section, we will demonstrate how to optimize the two parameters TDWI and SPAN of
the leaf_dynamics model using its differentiable implementation.
The optimization will be done using the Adam optimizer from torch.optim.
1.1. Software requirements
To run this notebook, we need to install diffwofost, the differentiable
version of the WOFOST models. Since the package is under active development, make
sure you have the latest version of diffwofost installed in your
Python environment. You can install it using pip:
# install diffwofost
!pip install diffwofost
# ---- import libraries ----
import copy
import torch
import numpy
import yaml
from pathlib import Path
from diffwofost.physical_models.utils import EngineTestHelper
from diffwofost.physical_models.utils import prepare_engine_input
# ---- disable a warning: this will be fixed in the future ----
import warnings
warnings.filterwarnings("ignore", message="To copy construct from a tensor.*")
1.2. Data
A test dataset of LAI (leaf area index, including stem and pod area) and
TWLV (dry weight of total leaves, living + dead) will be used to optimize the
parameters TDWI (total initial dry weight) and SPAN (life span of leaves).
Note that in leaf_dynamics, changes in SPAN don't affect TWLV.
The data is stored in the PCSE tests folder and can be downloaded from the PCSE repository.
You can select any of the files related to the leaf_dynamics model with a file name that follows the pattern
test_leafdynamics_wofost72_*.yaml. Each file contains different data depending on the location and crop type.
For example, you can download the file "test_leafdynamics_wofost72_01.yaml" as:
import urllib.request
url = "https://raw.githubusercontent.com/ajwdewit/pcse/refs/heads/master/tests/test_data/test_leafdynamics_wofost72_01.yaml"
filename = "test_leafdynamics_wofost72_01.yaml"
urllib.request.urlretrieve(url, filename)
print(f"Downloaded: {filename}")
Downloaded: test_leafdynamics_wofost72_01.yaml
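If you want to see what the test file contains, you can load the YAML and print its top-level keys and one ModelResults entry (ModelResults is the only key accessed directly later in this notebook; the remaining keys may differ between files).
# ---- optional: inspect the downloaded test file ----
import yaml
with open(filename) as f:
    test_data = yaml.safe_load(f)
print("Top-level keys:", list(test_data.keys()))
print("First ModelResults entry:", test_data["ModelResults"][0])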
We also need to download a config file to be able to run each crop module. This will change in future versions. To download the config file, you can use the following command:
url = "https://raw.githubusercontent.com/WUR-AI/diffWOFOST/refs/heads/main/tests/physical_models/test_data/WOFOST_Leaf_Dynamics.conf"
filename = "WOFOST_Leaf_Dynamics.conf"
urllib.request.urlretrieve(url, filename)
print(f"Downloaded: {filename}")
Downloaded: WOFOST_Leaf_Dynamics.conf
# ---- Check the path to the files that are downloaded as explained above ----
test_data_path = "test_leafdynamics_wofost72_01.yaml"
config_path = "WOFOST_Leaf_Dynamics.conf"
# ---- Here we read the test data and set some variables ----
(crop_model_params_provider, weather_data_provider, agro_management_inputs, external_states) = (
prepare_engine_input(test_data_path, ["SPAN", "TDWI", "TBASE", "PERDL", "RGRLAI"])
)
expected_results = yaml.safe_load(open(test_data_path))["ModelResults"]
expected_lai_twlv = torch.tensor(
[[float(item["LAI"]), float(item["TWLV"])] for item in expected_results], dtype=torch.float32
).unsqueeze(0) # shape: [1, time_steps, 2]
# ---- don't change this: this config file specifies the differentiable version of leaf_dynamics ----
config_path = str(Path(config_path).resolve())
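As an optional sanity check, you can confirm the shape of the target tensor and print the reference parameter values stored in the test data; these are the values the optimization below should approximately recover.
# ---- optional sanity check ----
print("Target tensor shape:", expected_lai_twlv.shape)   # expected: [1, time_steps, 2]
print("Reference TDWI:", crop_model_params_provider["TDWI"])
print("Reference SPAN:", crop_model_params_provider["SPAN"])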
1.3. Helper classes/functions
The model parameters should stay in a valid range. To ensure this, we will use the
BoundedParameter class with (min, max) and initial values for each
parameter. You might change these values depending on the crop type and
location, but don't use a very small range; otherwise the gradients will be very
small and the optimization will be very slow.
# ---- Adjust the values if needed ----
TDWI_MIN, TDWI_MAX, TDWI_INIT = (0.0, 1.0, 0.40)
SPAN_MIN, SPAN_MAX, SPAN_INIT = (10.0, 60.0, 30.0)
# ---- Helper for bounded parameters ----
class BoundedParameter(torch.nn.Module):
def __init__(self, low, high, init_value):
super().__init__()
self.low = low
self.high = high
# Normalize to [0, 1]
init_norm = (init_value - low) / (high - low)
# Parameter in raw logit space
self.raw = torch.nn.Parameter(torch.logit(torch.tensor(init_norm, dtype=torch.float32), eps=1e-6))
def forward(self):
return self.low + (self.high - self.low) * torch.sigmoid(self.raw)
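As a minimal check of this reparameterization, you can instantiate BoundedParameter, verify that its initial output matches init_value, and confirm that gradients flow through the bounded transform:
# ---- optional: quick check of BoundedParameter ----
p = BoundedParameter(TDWI_MIN, TDWI_MAX, init_value=TDWI_INIT)
value = p()
print(value)           # should be close to TDWI_INIT
value.backward()
print(p.raw.grad)      # a non-zero gradient confirms the transform is differentiable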
Another helper class is OptDiffLeafDynamics, which is a subclass of torch.nn.Module.
We use this class to wrap EngineTestHelper and make it easier to run the leaf_dynamics model.
# ---- Wrap the model with torch.nn.Module----
class OptDiffLeafDynamics(torch.nn.Module):
def __init__(self, crop_model_params_provider, weather_data_provider, agro_management_inputs, config_path, external_states):
super().__init__()
self.crop_model_params_provider = crop_model_params_provider
self.weather_data_provider = weather_data_provider
self.agro_management_inputs = agro_management_inputs
self.config_path = config_path
self.external_states = external_states
# bounded parameters
self.tdwi = BoundedParameter(TDWI_MIN, TDWI_MAX, init_value=TDWI_INIT)
self.span = BoundedParameter(SPAN_MIN, SPAN_MAX, init_value=SPAN_INIT)
def forward(self):
# currently, copying is needed due to an internal issue in engine
crop_model_params_provider_ = copy.deepcopy(self.crop_model_params_provider)
external_states_ = copy.deepcopy(self.external_states)
tdwi_val = self.tdwi()
span_val = self.span()
# pass new value of parameters to the model
crop_model_params_provider_.set_override("TDWI", tdwi_val, check=False)
crop_model_params_provider_.set_override("SPAN", span_val, check=False)
engine = EngineTestHelper(
crop_model_params_provider_,
self.weather_data_provider,
self.agro_management_inputs,
self.config_path,
external_states_,
)
engine.run_till_terminate()
results = engine.get_output()
return torch.stack(
[torch.stack([item["LAI"], item["TWLV"]]) for item in results]
).unsqueeze(0) # shape: [1, time_steps, 2]
# ---- Create model ----
opt_model = OptDiffLeafDynamics(
crop_model_params_provider,
weather_data_provider,
agro_management_inputs,
config_path,
external_states,
)
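# ---- optional: single forward pass and gradient check (sketch) ----
# Before optimizing, you can run one forward pass to confirm that the engine runs
# and that gradients reach the bounded parameters. Note that each forward pass runs
# the full engine, so this can take a moment.
out = opt_model()
print("Model output shape:", out.shape)  # expected: [1, time_steps, 2]
out.sum().backward()
print("TDWI grad:", opt_model.tdwi.raw.grad)
print("SPAN grad:", opt_model.span.raw.grad)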
# ---- Optimizer ----
optimizer = torch.optim.Adam(opt_model.parameters(), lr=0.1)
# ---- We use relative MAE as loss because there are two outputs with different units ----
denom = torch.mean(torch.abs(expected_lai_twlv), dim=1)
# Training loop (example)
for step in range(101):
optimizer.zero_grad()
results = opt_model()
mae = torch.mean(torch.abs(results - expected_lai_twlv), dim=1)
rmae = mae / denom
loss = rmae.sum() # example: relative mean absolute error
loss.backward()
optimizer.step()
if step % 10 == 0:
print(f"Step {step}, Loss {loss.item():.4f}, TDWI {opt_model.tdwi().item():.4f}, SPAN {opt_model.span().item():.4f}")
Step 0, Loss 0.1348, TDWI 0.4242, SPAN 31.2111
Step 10, Loss 0.0589, TDWI 0.4890, SPAN 36.7883
Step 20, Loss 0.0174, TDWI 0.5191, SPAN 34.8344
Step 30, Loss 0.0113, TDWI 0.5056, SPAN 35.0567
Step 40, Loss 0.0067, TDWI 0.5100, SPAN 35.4831
Step 50, Loss 0.0121, TDWI 0.5019, SPAN 34.6534
Step 60, Loss 0.0005, TDWI 0.5038, SPAN 34.6887
Step 70, Loss 0.0079, TDWI 0.5015, SPAN 35.7478
Step 80, Loss 0.0145, TDWI 0.5061, SPAN 34.5574
Step 90, Loss 0.0110, TDWI 0.4985, SPAN 34.3927
Step 100, Loss 0.0271, TDWI 0.5064, SPAN 36.3026
# ---- validate the results using test data ----
print(f"Actual TDWI {crop_model_params_provider["TDWI"].item():.4f}, SPAN {crop_model_params_provider["SPAN"].item():.4f}")
Actual TDWI 0.5100, SPAN 35.0000
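To compare the fitted simulation with the test data visually, you can run the model once more with the optimized parameters and plot LAI and TWLV. This is an optional sketch; it assumes matplotlib is installed in your environment (it is not a dependency of diffwofost).
# ---- optional: plot fitted vs. observed LAI and TWLV ----
import matplotlib.pyplot as plt
with torch.no_grad():
    fitted = opt_model().squeeze(0)   # shape: [time_steps, 2]
observed = expected_lai_twlv.squeeze(0)
fig, axes = plt.subplots(1, 2, figsize=(10, 4))
for i, name in enumerate(["LAI", "TWLV"]):
    axes[i].plot(observed[:, i], label="test data")
    axes[i].plot(fitted[:, i], "--", label="fitted")
    axes[i].set_title(name)
    axes[i].legend()
plt.tight_layout()
plt.show()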
2. Root dynamics
In this section, we will demonstrate how to optimize the parameter TDWI of the
root_dynamics model using its differentiable implementation.
The optimization will be done using the Adam optimizer from torch.optim.
2.1. Software requirements
To run this notebook, we need to install diffwofost, the differentiable
version of the WOFOST models. Since the package is under active development, make
sure you have the latest version of diffwofost installed in your
Python environment. You can install it using pip:
# install diffwofost
!pip install diffwofost
# ---- import libraries ----
import copy
import torch
import numpy
import yaml
from pathlib import Path
from diffwofost.physical_models.utils import EngineTestHelper
from diffwofost.physical_models.utils import prepare_engine_input
# ---- disable a warning: this will be fixed in the future ----
import warnings
warnings.filterwarnings("ignore", message="To copy construct from a tensor.*")
2.2. Data
A test dataset of TWRT (total weight of roots) will be used to optimize the
parameter TDWI (total initial dry weight). Note that in root_dynamics, changes in TDWI don't affect RD (current rooting depth).
The data is stored in the PCSE tests folder and can be downloaded from the PCSE repository.
You can select any of the files related to the root_dynamics model with a file name that follows the pattern
test_rootdynamics_wofost72_*.yaml. Each file contains different data depending on the location and crop type.
For example, you can download the file "test_rootdynamics_wofost72_01.yaml" as:
import urllib.request
url = "https://raw.githubusercontent.com/ajwdewit/pcse/refs/heads/master/tests/test_data/test_rootdynamics_wofost72_01.yaml"
filename = "test_rootdynamics_wofost72_01.yaml"
urllib.request.urlretrieve(url, filename)
print(f"Downloaded: {filename}")
Downloaded: test_rootdynamics_wofost72_01.yaml
We also need to download a config file to be able to run each crop module. This will change in future versions. To download the config file, you can use the following command:
url = "https://raw.githubusercontent.com/WUR-AI/diffWOFOST/refs/heads/main/tests/physical_models/test_data/WOFOST_Root_Dynamics.conf"
filename = "WOFOST_Root_Dynamics.conf"
urllib.request.urlretrieve(url, filename)
print(f"Downloaded: {filename}")
Downloaded: WOFOST_Root_Dynamics.conf
# ---- Check the path to the files that are downloaded as explained above ----
test_data_path = "test_rootdynamics_wofost72_01.yaml"
config_path = "WOFOST_Root_Dynamics.conf"
# ---- Here we read the test data and set some variables ----
(crop_model_params_provider, weather_data_provider, agro_management_inputs, external_states) = (
prepare_engine_input(test_data_path, ["RDI", "RRI", "RDMCR", "RDMSOL", "TDWI", "IAIRDU"])
)
expected_results = yaml.safe_load(open(test_data_path))["ModelResults"]
expected_twrt = torch.tensor(
[float(item["TWRT"]) for item in expected_results], dtype=torch.float32
) # shape: [time_steps]
# ---- don't change this: this config file specifies the differentiable version of root_dynamics ----
config_path = str(Path(config_path).resolve())
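As an optional sanity check, you can confirm the shape of the target tensor and print the reference TDWI value stored in the test data; this is the value the optimization below should approximately recover.
# ---- optional sanity check ----
print("Target tensor shape:", expected_twrt.shape)   # expected: [time_steps]
print("Reference TDWI:", crop_model_params_provider["TDWI"])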
2.3. Helper classes/functions
The model parameters should stay in a valid range. To ensure this, we will use the
BoundedParameter class with (min, max) and initial values for each
parameter. You might change these values depending on the crop type and
location, but don't use a very small range; otherwise the gradients will be very
small and the optimization will be very slow.
# ---- Adjust the values if needed ----
TDWI_MIN, TDWI_MAX, TDWI_INIT = (0.0, 1.0, 0.30)
# ---- Helper for bounded parameters ----
class BoundedParameter(torch.nn.Module):
def __init__(self, low, high, init_value):
super().__init__()
self.low = low
self.high = high
# Normalize to [0, 1]
init_norm = (init_value - low) / (high - low)
# Parameter in raw logit space
self.raw = torch.nn.Parameter(torch.logit(torch.tensor(init_norm, dtype=torch.float32), eps=1e-6))
def forward(self):
return self.low + (self.high - self.low) * torch.sigmoid(self.raw)
Another helper class is OptDiffRootDynamics, which is a subclass of torch.nn.Module.
We use this class to wrap EngineTestHelper and make it easier to run the root_dynamics model.
# ---- Wrap the model with torch.nn.Module----
class OptDiffRootDynamics(torch.nn.Module):
def __init__(self, crop_model_params_provider, weather_data_provider, agro_management_inputs, config_path, external_states):
super().__init__()
self.crop_model_params_provider = crop_model_params_provider
self.weather_data_provider = weather_data_provider
self.agro_management_inputs = agro_management_inputs
self.config_path = config_path
self.external_states = external_states
# bounded parameters
self.tdwi = BoundedParameter(TDWI_MIN, TDWI_MAX, init_value=TDWI_INIT)
def forward(self):
# currently, copying is needed due to an internal issue in engine
crop_model_params_provider_ = copy.deepcopy(self.crop_model_params_provider)
external_states_ = copy.deepcopy(self.external_states)
tdwi_val = self.tdwi()
# pass new value of parameters to the model
crop_model_params_provider_.set_override("TDWI", tdwi_val, check=False)
engine = EngineTestHelper(
crop_model_params_provider_,
self.weather_data_provider,
self.agro_management_inputs,
self.config_path,
external_states_,
)
engine.run_till_terminate()
results = engine.get_output()
return torch.stack([item["TWRT"] for item in results]) # shape: [time_steps]
# ---- Create model ----
opt_model = OptDiffRootDynamics(
crop_model_params_provider,
weather_data_provider,
agro_management_inputs,
config_path,
external_states,
)
# ---- Optimizer ----
optimizer = torch.optim.Adam(opt_model.parameters(), lr=0.1)
# ---- We normalize the MAE by the mean observed TWRT to get a relative (scale-independent) loss ----
denom = torch.mean(torch.abs(expected_twrt))
# Training loop (example)
for step in range(101):
optimizer.zero_grad()
results = opt_model()
mae = torch.mean(torch.abs(results - expected_twrt))
loss = mae / denom # example: relative mean absolute error
loss.backward()
optimizer.step()
if step % 10 == 0:
print(f"Step {step}, Loss {loss.item():.8f}, TDWI {opt_model.tdwi().item():.4f}")
Step 0, Loss 0.00000021, TDWI 0.5340
Step 10, Loss 0.00000015, TDWI 0.5137
Step 20, Loss 0.00000164, TDWI 0.5144
Step 30, Loss 0.00000068, TDWI 0.5107
Step 40, Loss 0.00000019, TDWI 0.5072
Step 50, Loss 0.00000080, TDWI 0.5084
Step 60, Loss 0.00000007, TDWI 0.5071
Step 70, Loss 0.00000100, TDWI 0.5091
Step 80, Loss 0.00000131, TDWI 0.5089
Step 90, Loss 0.00000068, TDWI 0.5115
Step 100, Loss 0.00000071, TDWI 0.5053
# ---- validate the results using test data ----
print(f"Actual TDWI {crop_model_params_provider["TDWI"].item():.4f}")
Actual TDWI 0.5100
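As in the leaf dynamics section, you can optionally plot the fitted TWRT simulation against the test data (again assuming matplotlib is installed in your environment):
# ---- optional: plot fitted vs. observed TWRT ----
import matplotlib.pyplot as plt
with torch.no_grad():
    fitted_twrt = opt_model()   # shape: [time_steps]
plt.plot(expected_twrt, label="test data")
plt.plot(fitted_twrt, "--", label="fitted")
plt.xlabel("time step")
plt.ylabel("TWRT")
plt.legend()
plt.show()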