SoftmaxWeightedSumFitter.priors_from_data
- SoftmaxWeightedSumFitter.priors_from_data(X, y)
Set a Normal prior on the unconstrained logit weights.
The prior is placed on N - 1 unconstrained logits (the first logit is pinned to zero). The default scale `sigma=1.0` provides moderate regularization, equivalent to `zeta=1.0` in the frequentist SDiD.

Unlike `WeightedSumFitter.priors_from_data()`, which must read `X.shape[1]` to size the Dirichlet concentration vector, the Normal prior here broadcasts automatically via its `dims`, so the data shape is not needed.

To control regularization strength, pass a custom `beta_raw` prior:

```python
# Tighter regularization (more DiD-like, near-uniform weights):
model = SoftmaxWeightedSumFitter(
    priors={
        "beta_raw": Prior(
            "Normal", mu=0, sigma=0.1, dims=["treated_units", "coeffs_raw"]
        )
    }
)

# Looser regularization (more SC-like, data-driven sparse weights):
model = SoftmaxWeightedSumFitter(
    priors={
        "beta_raw": Prior(
            "Normal", mu=0, sigma=10, dims=["treated_units", "coeffs_raw"]
        )
    }
)
```
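To see why only N - 1 free logits are needed, here is a minimal NumPy sketch of the pinned-logit softmax parametrization described above. This is an illustration, not CausalPy's implementation; the helper name `softmax_weights` is hypothetical.

```python
import numpy as np

def softmax_weights(free_logits):
    """Map N - 1 unconstrained logits to N simplex weights.

    The first logit is pinned to zero, so the remaining N - 1
    free parameters fully determine the weights (hypothetical
    helper for illustration).
    """
    logits = np.concatenate([[0.0], free_logits])  # pin first logit to zero
    exp = np.exp(logits - logits.max())            # numerically stable softmax
    return exp / exp.sum()

# Three free logits -> weights over four control units
w = softmax_weights(np.array([0.5, -1.0, 2.0]))
print(w)  # weights are positive and sum to 1
```

Pinning one logit removes the translation invariance of the softmax (adding a constant to all logits leaves the weights unchanged), so each weight vector corresponds to exactly one setting of the free logits.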
- Parameters:
X (xarray.DataArray) – Control unit data with shape (n_obs, n_control_units).
y (xarray.DataArray) – Treated unit outcome data.
- Returns:
Dictionary containing:
  - "beta_raw": Normal prior with dims ["treated_units", "coeffs_raw"]
- Return type:
dict
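To make the effect of the prior scale concrete, the following NumPy sketch samples logits from Normal(0, sigma) and compares the average largest weight for a tight versus a loose scale. This is a simulation under assumed settings (10 control units, names like `mean_max_weight` are hypothetical), not library behavior.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(logits):
    exp = np.exp(logits - logits.max())
    return exp / exp.sum()

def mean_max_weight(sigma, n_controls=10, n_draws=2000):
    """Average largest control weight under Normal(0, sigma) logit priors."""
    maxes = []
    for _ in range(n_draws):
        free = rng.normal(0.0, sigma, size=n_controls - 1)
        w = softmax(np.concatenate([[0.0], free]))  # first logit pinned
        maxes.append(w.max())
    return float(np.mean(maxes))

tight = mean_max_weight(sigma=0.1)   # near-uniform weights (max close to 1/10)
loose = mean_max_weight(sigma=10.0)  # concentrated, sparse-looking weights
print(tight, loose)
```

A small `sigma` keeps all logits near zero, so the softmax output stays close to uniform (the DiD-like regime); a large `sigma` lets a few logits dominate, concentrating mass on a handful of controls (the SC-like regime).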