SoftmaxWeightedSumFitter.priors_from_data

SoftmaxWeightedSumFitter.priors_from_data(X, y)

Set Normal prior for logit weights based on number of control units.

The prior is placed on the N - 1 unconstrained logits (the first logit is pinned to zero for identifiability). The default scale sigma=1.0 provides moderate regularization, equivalent to zeta=1.0 in the frequentist synthetic difference-in-differences (SDiD) estimator.

Unlike WeightedSumFitter.priors_from_data(), which must read X.shape[1] to size the Dirichlet concentration vector, the Normal prior here broadcasts automatically via its dims, so the data shape is not needed.
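The pinned-logit parameterization described above can be illustrated with a small NumPy sketch. The helper `pinned_softmax` is hypothetical (not part of this API); it shows how N - 1 free logits, with the first logit fixed at zero, map onto N simplex weights:

```python
import numpy as np

def pinned_softmax(beta_raw):
    """Map N - 1 unconstrained logits to N simplex weights.

    Hypothetical helper mirroring the parameterization described
    above: the first logit is pinned to zero for identifiability.
    """
    logits = np.concatenate(([0.0], np.asarray(beta_raw, dtype=float)))
    e = np.exp(logits - logits.max())  # numerically stabilized softmax
    return e / e.sum()

# Three control units -> two free logits.
# All-zero logits recover exactly uniform weights [1/3, 1/3, 1/3].
print(pinned_softmax([0.0, 0.0]))
```

Because the weights always sum to one, shrinking the free logits toward zero (a tighter Normal prior) pulls the weights toward uniform.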

To control regularization strength, pass a custom beta_raw prior:

# Tighter regularization (more DiD-like, near-uniform weights):
model = SoftmaxWeightedSumFitter(
    priors={
        "beta_raw": Prior(
            "Normal", mu=0, sigma=0.1, dims=["treated_units", "coeffs_raw"]
        )
    }
)

# Looser regularization (more SC-like, data-driven sparse weights):
model = SoftmaxWeightedSumFitter(
    priors={
        "beta_raw": Prior(
            "Normal", mu=0, sigma=10, dims=["treated_units", "coeffs_raw"]
        )
    }
)
Parameters:

X, y – accepted for interface consistency; the prior does not depend on the data shape (see above).
Returns:

Dictionary containing:

- "beta_raw": Normal prior with dims ["treated_units", "coeffs_raw"]

Return type:

dict[str, Prior]