PyPI metadata for `rkhs`:

- Name: rkhs
- Version: 0.0.1
- Summary: A unified framework for marginal and conditional two-sample testing with kernels in JAX.
- Homepage: https://github.com/lukashaverbeck/rkhs
- Author: Lukas Haverbeck
- Upload time: 2025-08-27 11:07:06
- Requires Python: >=3.13
- License: MIT
- Keywords: kernels, jax, two-sample-test
# 🌱 rkhs

`rkhs` is a small Python framework for marginal and conditional two-sample testing with kernels in JAX.

---

## ⚙️ Installation
```bash
pip install rkhs
```

---

## 🚀 Features
`rkhs` provides JAX-native two-sample tests (marginal, conditional, mixed) based on kernel embeddings with analytical or bootstrap confidence bounds, a simple API, and pluggable kernels.

### Three test modes — one API 
You can test the following two-sample hypotheses with one common set of primitives: 
  - **Marginal:** $H_0: P = Q$
  - **Conditional:** $H_0(x_1,x_2): P(\cdot\mid X=x_1) = Q(\cdot\mid X=x_2)$
  - **Mixed:** $H_0(x): P(\cdot\mid X=x) = Q$

The test compares kernel embeddings in RKHS norm and rejects $H_0$ at level $\alpha$ if

$$
  \|\hat\mu_P - \hat\mu_Q\|_{\mathcal{H}} > \beta_P + \beta_Q,
$$

where $\beta_\ast$ are finite-sample confidence radii from the selected regime.
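
For the marginal case, the left-hand side is the familiar (biased) MMD estimate. The following is a plain-NumPy sketch of this decision rule, independent of the library's API: the distance uses the standard V-statistic identity $\|\hat\mu_P - \hat\mu_Q\|^2_{\mathcal{H}} = \overline{K}_{XX} - 2\overline{K}_{XY} + \overline{K}_{YY}$, and the radius uses a classical McDiarmid-style concentration bound for a single mean embedding ($2\sqrt{K/n} + \sqrt{2K\log(1/\delta)/n}$, with the level split across the two samples). The exact constants `rkhs` uses may differ.

```python
import numpy as np

def gaussian_gram(a, b, bandwidth=1.5):
    # k(x, y) = exp(-||x - y||^2 / (2 * bandwidth^2))
    sq = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2 * bandwidth**2))

def rkhs_distance(xs, ys, bandwidth=1.5):
    # ||mu_P_hat - mu_Q_hat||_H^2 = mean(Kxx) - 2 mean(Kxy) + mean(Kyy)
    kxx = gaussian_gram(xs, xs, bandwidth).mean()
    kxy = gaussian_gram(xs, ys, bandwidth).mean()
    kyy = gaussian_gram(ys, ys, bandwidth).mean()
    return np.sqrt(max(kxx - 2 * kxy + kyy, 0.0))

def analytical_radius(n, delta, kernel_bound=1.0):
    # classical concentration radius for one empirical mean embedding
    return 2 * np.sqrt(kernel_bound / n) + np.sqrt(2 * kernel_bound * np.log(1 / delta) / n)

rng = np.random.default_rng(0)
xs = rng.normal(size=(200, 3))
ys = rng.normal(size=(200, 3)) + 1.0

dist = rkhs_distance(xs, ys)
# split the level alpha = 0.05 across the two radii
threshold = analytical_radius(200, 0.025) + analytical_radius(200, 0.025)
print(dist > threshold, dist, threshold)
```

As the "conservative" remark below suggests, analytical radii of this form can be loose at moderate sample sizes; the bootstrap regime typically rejects more readily.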

### Confidence regimes
**Analytical bounds.** Finite-sample guarantees under the stated assumptions (conservative, little overhead).

**Bootstrap bounds.** Data-driven thresholds with typically higher power (cost scales with the number of resamples).
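
One generic way such a bootstrap radius can be computed (a sketch of the idea, not necessarily the exact resampling scheme `rkhs` implements): resample the dataset with replacement, measure the RKHS distance between the resampled and the original mean embedding, and take the $(1-\alpha)$-quantile of those distances.

```python
import numpy as np

def gaussian_gram(a, b, bandwidth=1.5):
    sq = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2 * bandwidth**2))

def bootstrap_radius(xs, level=0.05, n_bootstrap=500, bandwidth=1.5, seed=0):
    """(1 - level)-quantile of ||mu_hat* - mu_hat||_H over bootstrap resamples."""
    rng = np.random.default_rng(seed)
    n = len(xs)
    gram = gaussian_gram(xs, xs, bandwidth)
    dists = np.empty(n_bootstrap)
    for b in range(n_bootstrap):
        idx = rng.integers(0, n, size=n)  # resample indices with replacement
        # ||mu* - mu||^2 = mean(K[idx, idx]) - 2 mean(K[idx, :]) + mean(K)
        d2 = gram[np.ix_(idx, idx)].mean() - 2 * gram[idx, :].mean() + gram.mean()
        dists[b] = np.sqrt(max(d2, 0.0))
    return np.quantile(dists, 1 - level)

rng = np.random.default_rng(1)
xs = rng.normal(size=(200, 3))
radius = bootstrap_radius(xs)
print(radius)
```

The loop makes the cost scaling explicit: each resample reuses the precomputed Gram matrix, so the per-resample work is index selection plus averaging.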

### JAX integration
Works with `jit`/`vmap`, runs on CPU/GPU/TPU, and uses explicit `PRNGKey` for reproducibility.

### Kernels
Popular kernels are built in: `Gaussian`, `Matern`, `Laplacian`, `Polynomial`, `Linear`.

Conditional tests use a scalar kernel on the input domain and a separate kernel on the output domain: 
`VectorKernel(x=..., y=..., regularization=...)`.
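
Under the hood, a conditional mean embedding of this kind is typically a kernel ridge regression into the output RKHS: evaluated at a covariate $x$, the embedding is $\hat\mu(x) = \sum_i \alpha_i(x)\, k_y(y_i, \cdot)$ with weights $\alpha(x) = (K_x + n\lambda I)^{-1} k_x(X, x)$. A NumPy sketch of those weights (illustrative; `VectorKernel.cme` presumably computes something of this form, with `regularization` playing the role of $\lambda$):

```python
import numpy as np

def gaussian_gram(a, b, bandwidth=0.5):
    sq = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2 * bandwidth**2))

def cme_weights(xs_train, x_query, regularization=0.1, bandwidth=0.5):
    """alpha(x) = (K_x + n * lambda * I)^{-1} k_x(X, x): ridge weights of the CME."""
    n = len(xs_train)
    k_train = gaussian_gram(xs_train, xs_train, bandwidth)        # (n, n)
    k_query = gaussian_gram(xs_train, x_query[None, :], bandwidth)[:, 0]  # (n,)
    return np.linalg.solve(k_train + n * regularization * np.eye(n), k_query)

rng = np.random.default_rng(2)
xs = rng.normal(size=(100, 3))
alpha = cme_weights(xs, xs[0])
# the conditional embedding at x is then  mu_hat(x) = sum_i alpha_i * k_y(y_i, .)
```

This also explains the split of roles described in the kernel quick reference below: the `x` kernel only enters through the ridge system, while the `y` kernel determines the space in which the resulting embeddings are compared.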

---

## 🧩 Usage

### 1) Marginal two-sample test (analytical bounds)

```python
import jax
from rkhs.testing import TestEmbedding, TwoSampleTest
from rkhs.kernels import GaussianKernel

# toy data: two 3D Gaussians with different means
xs_1 = jax.random.normal(key=jax.random.key(1), shape=(200, 3))
xs_2 = jax.random.normal(key=jax.random.key(2), shape=(200, 3)) + 1.0

# kernel on the sample space
kernel = GaussianKernel(bandwidth=1.5, data_shape=(3,))

# embedding + analytical confidence radius
kme_1 = TestEmbedding.analytical(
    kme=kernel.kme(xs_1),   # embed dataset in RKHS
    kernel_bound=1.0        # sup_x k(x, x)
)
kme_2 = TestEmbedding.analytical(
    kme=kernel.kme(xs_2),   # embed dataset in RKHS
    kernel_bound=1.0        # sup_x k(x, x)
) 

# level-α test
test = TwoSampleTest.from_embeddings(kme_1, kme_2, level=0.05)

decision = test.reject      # boolean (reject H_0?)
distance = test.distance    # RKHS distance
threshold = test.threshold  # β_P + β_Q
print(decision, distance, threshold)
```
### 2) Marginal test (bootstrap bounds)

```python
import jax
from rkhs.testing import TestEmbedding, TwoSampleTest
from rkhs.kernels import GaussianKernel

# toy data: two 3D Gaussians with different means
xs_1 = jax.random.normal(key=jax.random.key(1), shape=(200, 3))
xs_2 = jax.random.normal(key=jax.random.key(2), shape=(200, 3)) + 1.0

# kernel on the sample space
kernel = GaussianKernel(bandwidth=1.5, data_shape=(3,))

# embedding + bootstrap confidence radius
kme_1 = TestEmbedding.bootstrap(
    kme=kernel.kme(xs_1),   # embed dataset in RKHS
    key=jax.random.key(3),  # random key
    n_bootstrap=1000        # number of bootstrap resamples
)
kme_2 = TestEmbedding.bootstrap(
    kme=kernel.kme(xs_2),   # embed dataset in RKHS
    key=jax.random.key(4),  # random key
    n_bootstrap=1000        # number of bootstrap resamples
) 

# level-α test
test = TwoSampleTest.from_embeddings(kme_1, kme_2, level=0.05)

decision = test.reject      # boolean (reject H_0?)
distance = test.distance    # RKHS distance
threshold = test.threshold  # β_P + β_Q
print(decision, distance, threshold)
```

### 3) Conditional two-sample test at selected covariates

```python
import jax
from rkhs import VectorKernel
from rkhs.testing import ConditionalTestEmbedding, TwoSampleTest
from rkhs.kernels import GaussianKernel

# synthetic x,y pairs with additive noise
xs_1 = jax.random.normal(key=jax.random.key(1), shape=(1000, 3))
ys_1 = xs_1 + jax.random.normal(key=jax.random.key(2), shape=(1000, 3)) * 0.05

xs_2 = jax.random.normal(key=jax.random.key(3), shape=(1000, 3))
ys_2 = xs_2 + jax.random.normal(key=jax.random.key(4), shape=(1000, 3)) * 0.05 + 0.5

# inputs at which to test for distributional equality: H_0(x): P(. | x) =? Q(. | x), for x in grid
covariates = jax.numpy.linspace(jax.numpy.array([-3, -3, -3]), jax.numpy.array([3, 3, 3]), num=100)

# vector-valued kernel over inputs X and outputs Y
kernel = VectorKernel(
    x=GaussianKernel(bandwidth=0.5, data_shape=(3,)),  # kernel used for ridge regression
    y=GaussianKernel(bandwidth=1.0, data_shape=(3,)),  # kernel used for embedding marginal distribution at each covariate
    regularization=0.1
)

# conditional embedding + bootstrap confidence radius
cme_1 = ConditionalTestEmbedding.bootstrap(
    cme=kernel.cme(xs_1, ys_1), # embed dataset in vector-valued RKHS
    grid=covariates,            # covariates used in bootstrap of threshold parameters
    key=jax.random.key(5),      # random key
    n_bootstrap=100             # number of bootstrap resamples
)

cme_2 = ConditionalTestEmbedding.bootstrap(
    cme=kernel.cme(xs_2, ys_2), # embed dataset in vector-valued RKHS
    grid=covariates,            # covariates used in bootstrap of threshold parameters
    key=jax.random.key(6),      # random key
    n_bootstrap=100             # number of bootstrap resamples
)

# evaluate CMEs at covariates -> embeds each distribution over Y in RKHS of `kernel.y`
kme_1 = cme_1(covariates)
kme_2 = cme_2(covariates)

# batched test across all covariates
test = TwoSampleTest.from_embeddings(kme_1, kme_2, level=0.05)

decision = test.reject      # boolean (reject H_0(x)?, individually for each covariate). shape: covariates.shape
distance = test.distance    # RKHS distance. shape: covariates.shape
threshold = test.threshold  # β_P + β_Q
print(decision, distance, threshold)
```

### 4) Mixed test: $P(\cdot\mid X=x)$ vs. $Q$

```python
import jax
from rkhs import VectorKernel
from rkhs.testing import TestEmbedding, ConditionalTestEmbedding, TwoSampleTest
from rkhs.kernels import GaussianKernel

# dataset from marginal distribution over Y
ys_1 = jax.random.normal(key=jax.random.key(1), shape=(1000, 3)) * 0.05

# synthetic x,y pairs with additive noise
xs_2 = jax.random.normal(key=jax.random.key(2), shape=(1000, 3))
ys_2 = xs_2 + jax.random.normal(key=jax.random.key(3), shape=(1000, 3)) * 0.05

# inputs at which to test for distributional equality: H_0(x): P =? Q(. | x), for x in grid
covariates = jax.numpy.linspace(jax.numpy.array([-3, -3, -3]), jax.numpy.array([3, 3, 3]), num=200)

y_kernel = GaussianKernel(bandwidth=1.0, data_shape=(3,))

# vector-valued kernel over inputs X and outputs Y
vector_kernel = VectorKernel(
    x=GaussianKernel(bandwidth=0.5, data_shape=(3,)),  # kernel used for ridge regression
    y=y_kernel,                                        # kernel used for embedding marginal distribution at each covariate
    regularization=0.1
)

# embedding + analytical confidence radius (can be drop-in replaced with bootstrap radius)
kme_1 = TestEmbedding.analytical(
    kme=y_kernel.kme(ys_1), # embed dataset in RKHS
    kernel_bound=1.0        # sup_x k(x, x)
)

# conditional embedding + bootstrap confidence radius
cme_2 = ConditionalTestEmbedding.bootstrap(
    cme=vector_kernel.cme(xs_2, ys_2),  # embed dataset in vector-valued RKHS
    grid=covariates,                    # covariates used in bootstrap of threshold parameters
    key=jax.random.key(4),              # random key
    n_bootstrap=100                     # number of bootstrap resamples
)

# evaluate CME at covariates -> embeds each distribution over Y in RKHS of `kernel.y`
kme_2 = cme_2(covariates)

# batched test across all covariates
test = TwoSampleTest.from_embeddings(kme_1, kme_2, level=0.05)

decision = test.reject      # boolean (reject H_0(x)?, individually for each covariate). shape: covariates.shape
distance = test.distance    # RKHS distance. shape: covariates.shape
threshold = test.threshold  # β_P + β_Q
print(decision, distance, threshold)
```

---

## 🔍 Kernel quick reference

- `LinearKernel` — compares means (first moment).
- `PolynomialKernel(degree=d)` — compares moments up to degree $d$.
- `Gaussian`, `Matern`, `Laplacian` — characteristic; compare full distributions.
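
The claim about `LinearKernel` can be checked directly: with $k(x, y) = x \cdot y$, the squared RKHS distance between the empirical embeddings reduces exactly to the squared Euclidean distance between the sample means. A quick NumPy check, independent of the library:

```python
import numpy as np

rng = np.random.default_rng(3)
xs = rng.normal(size=(300, 3))
ys = rng.normal(size=(300, 3)) + 0.5

# squared MMD with the linear kernel k(x, y) = x . y
mmd2 = (xs @ xs.T).mean() - 2 * (xs @ ys.T).mean() + (ys @ ys.T).mean()

# squared distance between the sample means
mean_gap2 = ((xs.mean(0) - ys.mean(0)) ** 2).sum()

print(np.isclose(mmd2, mean_gap2))  # the two quantities coincide
```

This is why a linear-kernel test is blind to any difference beyond the first moment, and why the characteristic kernels are the default choice for comparing full distributions.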

For conditional tests:
- **Input kernel (`x`)**: used to learn the conditional embedding (not for comparison).
- **Output kernel (`y`)**: determines what aspects of the conditional law are compared.

---

## 🧠 Notes
- Embeddings preserve batch axes; passing a batch of covariates returns a batch of embeddings.
- All randomness is explicit via JAX PRNG keys (created with `jax.random.key`).
- You can use your own custom kernel by extending `rkhs.Kernel`:

```python
from jax import Array
from rkhs import Kernel

class MyCustomKernel(Kernel):
    def __init__(self, data_shape: tuple[int, ...]):
        super().__init__(data_shape)
        ...
    
    def _dot(self, x1: Array, x2: Array) -> Array:
        ...  # your logic here (must be jit-compilable)
```
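
As a concrete example of the kind of logic `_dot` would hold, here is a rational quadratic kernel written standalone on NumPy arrays (the subclass above would wrap the same arithmetic on JAX arrays; the kernel choice and parameter names here are illustrative, not part of `rkhs`):

```python
import numpy as np

def rational_quadratic_dot(x1, x2, alpha=1.0, lengthscale=1.0):
    """k(x1, x2) = (1 + ||x1 - x2||^2 / (2 * alpha * l^2)) ** (-alpha)."""
    sq_dist = ((np.asarray(x1) - np.asarray(x2)) ** 2).sum()
    return (1.0 + sq_dist / (2.0 * alpha * lengthscale**2)) ** (-alpha)

x = np.array([1.0, 2.0, 3.0])
y = np.array([1.5, 2.0, 2.5])
print(rational_quadratic_dot(x, y))  # symmetric, and 1.0 at x == y
```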

---

## 📚 References

- Marginal test: Gretton, A., et al. (2012). *A Kernel Two-Sample Test*. [JMLR page](https://jmlr.csail.mit.edu/papers/v13/gretton12a.html) · [PDF](https://www.jmlr.org/papers/volume13/gretton12a/gretton12a.pdf)

- Conditional test: Massiani, P.-F., et al. (2025). *A Kernel Conditional Two-Sample Test*. [arXiv](https://arxiv.org/abs/2506.03898) · [PDF](https://arxiv.org/pdf/2506.03898)

            
