pydoublelasso

Name: pydoublelasso
Version: 0.1.0
Home page: https://github.com/vyasenov/pydoublelasso
Summary: Python implementation of the Double Post-Lasso estimator for treatment effect estimation
Upload time: 2025-07-10 23:31:22
Author: Vasco Yasenov
Requires Python: >=3.7
Keywords: lasso, treatment effects, causal inference, high-dimensional, econometrics
Requirements: numpy, scikit-learn, statsmodels, pandas
# pydoublelasso

A Python package for estimating treatment effects using the Double Post-Lasso procedure from Belloni, Chernozhukov, and Hansen (2014). This method is designed for valid inference in the presence of many covariates, using Lasso for model selection followed by OLS for estimation.

## Installation

You can install the package using pip:

```bash
pip install pydoublelasso
```

## Features

* High-dimensional treatment effect estimation with many covariates
* Double selection Lasso for robust control variable selection
* Post-selection inference with valid confidence intervals
* Supports binary or continuous treatment variables
* Bootstrap and asymptotic confidence intervals
* Easy integration with pandas and scikit-learn pipelines

## Quick Start

```python
import numpy as np
from pydoublelasso import DoublePostLasso

# Generate synthetic data
np.random.seed(1988)
n_obs, n_features = 1000, 50

# Generate covariates, treatment, and outcome
X = np.random.randn(n_obs, n_features)
D = X[:, 0] + 0.5 * X[:, 1] + np.random.randn(n_obs) * 0.5  # Treatment depends on X0, X1
Y = 2 * D + X[:, 2] + np.random.randn(n_obs) * 0.5  # Outcome depends on D and X2

# Run Double Post-Lasso
model = DoublePostLasso()
model.fit(X, D, Y)

# Get selected variables
print("Selected variables:", model.selected_vars_)

# Make predictions
y_pred = model.predict(X)
print("First 5 predictions:", y_pred[:5])
```

## Examples

See the `examples/` directory for additional use cases.

## Background

### Why Double Lasso?

When estimating a treatment effect, including too many irrelevant controls inflates variance, while omitting important confounders introduces bias. In high-dimensional settings, Lasso helps by selecting a sparse set of relevant covariates. However, two problems arise: (1) standard confidence intervals after Lasso are invalid due to model selection, and (2) Lasso estimates are biased toward zero because of regularization.

Double Post-Lasso, proposed by Belloni, Chernozhukov, and Hansen (2014), addresses this by performing variable selection in both the outcome and treatment equations. This approach ensures that the model controls for variables that influence either the treatment or the outcome, yielding valid estimates and confidence intervals for the treatment effect.

---

### Notation

Let's establish the following notation:

* $Y$: outcome variable
* $D$: treatment variable
* $X = (X_1, \dots, X_p)$: high-dimensional control variables

---

### Estimation

The goal is to estimate the partial effect of $D$ on $Y$, denoted $\alpha$, in the partially linear model:

$$
Y_i = \alpha D_i + f(X_i) + \varepsilon_i
$$

The Double Post-Lasso procedure proceeds as follows:

1. Fit Lasso of $Y \sim X$, selecting variables $\hat{S}_Y$
2. Fit Lasso of $D \sim X$, selecting variables $\hat{S}_D$
3. Define selected set $\hat{S} = \hat{S}_Y \cup \hat{S}_D$
4. Estimate $\alpha$ by OLS on:

$$
Y_i = \alpha D_i + X_{i,\hat{S}}^\top \beta + \varepsilon_i
$$

Belloni et al. (2014) show this final regression delivers a consistent and asymptotically normal estimator of $\alpha$.

---

### Assumptions

The method relies on the following key assumptions:

* Sparsity: The true regression functions depend only on a small subset of covariates
* Exogeneity: $D$ is exogenous after controlling for $X$
* Approximate linearity: The relationships $Y \sim X$ and $D \sim X$ can be well-approximated linearly
* Regularization: Lasso is appropriately tuned for consistent variable selection

---

### Confidence Intervals

The final post-Lasso OLS regression produces valid asymptotic standard errors, even though variable selection was performed. Additionally, the package supports bootstrap confidence intervals that account for randomness in both the selection and estimation stages.

---

## References

* Belloni, A., Chernozhukov, V., & Hansen, C. (2014). *Inference on treatment effects after selection among high-dimensional controls*. *The Review of Economic Studies*, 81(2), 608–650.
* Tibshirani, R. (1996). *Regression shrinkage and selection via the Lasso*. *Journal of the Royal Statistical Society: Series B*, 58(1), 267–288.

## License

This project is licensed under the MIT License – see the [LICENSE](LICENSE) file for details.

## Citation

To cite this package in publications, use the following BibTeX entry:

```bibtex
@misc{yasenov2025pydoublelasso,
  author       = {Vasco Yasenov},
  title        = {pydoublelasso: Python Implementation of the Double Post-Lasso Estimator},
  year         = {2025},
  howpublished = {\url{https://github.com/vyasenov/pydoublelasso}},
  note         = {Version 0.1.0}
}
```

            
