free-transformer

Name: free-transformer
Version: 0.1.2
Summary: A Llama-style decoder architecture with explicit latent plans and conditional VAE training
Upload time: 2025-11-02 18:44:20
Requires Python: >=3.11
License: MIT
Keywords: transformer, vae, deep-learning, pytorch, llm, language-model, latent-planning, conditional-vae, autoregressive, text-generation

# Free Transformer

[![PyPI version](https://badge.fury.io/py/free-transformer.svg)](https://badge.fury.io/py/free-transformer)
[![Python 3.11+](https://img.shields.io/badge/python-3.11+-blue.svg)](https://www.python.org/downloads/)
[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
[![Documentation](https://img.shields.io/badge/docs-github--pages-blue)](https://udapy.github.io/free-transformer/)
[![Tests](https://github.com/udapy/free-transformer/workflows/Tests/badge.svg)](https://github.com/udapy/free-transformer/actions)
[![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)

**Free Transformer**: A Llama-style decoder architecture with explicit latent plans, conditional VAE training, and benchmark comparisons against standard Transformers.

Designed for efficient PyTorch training on modern GPUs, with full FSDP support, mixed-precision training, and other modern optimizations.

> πŸ“– **[Complete Documentation](https://udapy.github.io/free-transformer/)** | πŸš€ **[Quick Start Guide](https://udapy.github.io/free-transformer/getting-started/quick-start/)** | πŸ—οΈ **[Architecture Details](https://udapy.github.io/free-transformer/architecture/overview/)**

---

## What Is the Free Transformer?

Traditional autoregressive Transformers generate each token by conditioning only on the sequence so far ("reactive" behavior).
**Free Transformer** introduces a latent planning mechanismβ€”first choosing a stochastic abstract plan (`Z`), then generating tokens to fit that plan.  
This scalable conditional VAE architecture maintains high-level coherence, improves controllability, and enables richer sequence modeling.
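
Conceptually, inference first samples the plan once, then decodes every token under that fixed plan. Below is a minimal toy sketch of this two-stage flow; the decoder class and function names are illustrative stand-ins, not the package's API:

```python
# Toy plan-then-generate loop. Everything here is a stand-in for illustration.
import torch
import torch.nn as nn

class ToyConditionalDecoder(nn.Module):
    """Stand-in decoder: embeds tokens, injects a projection of Z, predicts logits."""
    def __init__(self, vocab_size=1000, hidden_dim=64, latent_dim=8):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_dim)
        self.z_proj = nn.Linear(latent_dim, hidden_dim)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens, z):
        h = self.embed(tokens) + self.z_proj(z).unsqueeze(1)  # plan injection
        return self.out(h)

def plan_then_generate(decoder, prompt, latent_dim=8, max_new_tokens=20):
    # 1) Draw an abstract plan Z once, up front. Here it is a random binary
    #    code from a uniform prior; the trained model learns this distribution.
    z = torch.bernoulli(torch.full((prompt.size(0), latent_dim), 0.5))
    tokens = prompt
    # 2) Generate autoregressively, conditioning every step on the same Z.
    for _ in range(max_new_tokens):
        logits = decoder(tokens, z)
        next_tok = logits[:, -1].argmax(dim=-1, keepdim=True)
        tokens = torch.cat([tokens, next_tok], dim=-1)
    return tokens

decoder = ToyConditionalDecoder()
print(plan_then_generate(decoder, torch.randint(0, 1000, (2, 10))).shape)
```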

### Architecture Overview

![Free Transformer architecture diagram](image.png)

---

## Features

### πŸ—οΈ **Architecture**
- **Llama-style backbone**: RMSNorm, SwiGLU, RoPE, Grouped-Query Attention (GQA)
- **Latent Planning**: Explicit plan variable `Z` with differentiable binary coding
- **Conditional VAE**: Reconstruction + KL loss with free bits regularization (loss sketch after this list)
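
As a concrete illustration of the loss, here is a minimal sketch of reconstruction plus KL with a free-bits floor, assuming Bernoulli plan bits against a uniform prior; tensor shapes and the threshold value are illustrative, not the package's implementation:

```python
import torch
import torch.nn.functional as F

def conditional_vae_loss(logits, targets, q_logits, free_bits=0.5):
    """Reconstruction + KL with a per-dimension free-bits floor (sketch)."""
    # Reconstruction: ordinary next-token cross-entropy.
    recon = F.cross_entropy(logits.reshape(-1, logits.size(-1)), targets.reshape(-1))
    # KL(q(Z|x) || p(Z)) for Bernoulli latents against a uniform prior p = 0.5.
    q = torch.sigmoid(q_logits)
    kl_per_dim = (
        q * torch.log(q.clamp_min(1e-8) / 0.5)
        + (1 - q) * torch.log((1 - q).clamp_min(1e-8) / 0.5)
    )
    # Free bits: dimensions whose KL already sits below the floor contribute a
    # constant and get no gradient, which keeps the latent from collapsing.
    kl = kl_per_dim.clamp_min(free_bits).sum(dim=-1).mean()
    return recon + kl

loss = conditional_vae_loss(
    torch.randn(2, 16, 1000),         # (batch, seq, vocab) logits
    torch.randint(0, 1000, (2, 16)),  # next-token targets
    torch.randn(2, 8),                # posterior logits for 8 plan bits
)
```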

### ⚑ **Performance & Scaling**
- **FSDP Support**: Multi-GPU training with PyTorch Fully Sharded Data Parallel
- **Mixed Precision**: Automatic Mixed Precision (AMP) with gradient scaling
- **Memory Efficient**: Gradient checkpointing and optimized attention patterns
- **Modern Optimizations**: bfloat16, efficient parameter sharding

### πŸ”§ **Development & Training**
- **Flexible Training**: Switchable inference/training flows with mode selection
- **Synthetic + Real Data**: Fast prototyping with built-in synthetic data generation
- **Comprehensive Testing**: Unit/integration tests, benchmark comparisons
- **Quality Assurance**: Type checking, linting, formatting, CI-ready

### πŸ“¦ **Usability**
- **Extensible API**: Modular classes, CLI scripts, YAML configuration
- **Docker Support**: Containerized demos and development environment
- **Documentation**: API references, architecture guides, examples

---

## Installation

### From PyPI (Recommended)

```bash
pip install free-transformer
```

### From Source

Using [UV](https://github.com/astral-sh/uv) (recommended):

```bash
# Install UV
curl -LsSf https://astral.sh/uv/install.sh | sh

# Clone and install
git clone https://github.com/udapy/free-transformer.git
cd free-transformer
uv venv --python 3.12
source .venv/bin/activate
uv pip install -e ".[dev]"
```

Using pip:

```bash
git clone https://github.com/udapy/free-transformer.git
cd free-transformer
pip install -e ".[dev]"
```

> πŸ“‹ **Detailed installation instructions**: [Installation Guide](https://udapy.github.io/free-transformer/getting-started/installation/)

---

## Quick Start

### 🐳 Docker (Fastest)

The fastest way to try Free Transformer:

```bash
git clone https://github.com/udapy/free-transformer.git
cd free-transformer
docker-compose up free-transformer-demo
```

### 🐍 Python API

```python
from free_transformer import FreeTransformer, ModelConfig

# Create and train a model
config = ModelConfig(vocab_size=1000, hidden_dim=128, num_layers=6, latent_dim=8)
model = FreeTransformer(config)

# Training mode
import torch
tokens = torch.randint(0, 1000, (2, 128))
logits, z_logits = model(tokens, mode='training')

# Generation
generated = model.generate(tokens[:, :10], max_new_tokens=20)
```
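
Continuing the snippet above, a training step would combine the two outputs into the conditional VAE objective. The package ships its own loss code (`src/free_transformer/losses.py`), so treat the following as a shape sketch only, assuming `logits` are `(batch, seq, vocab)` and `z_logits` hold per-dimension Bernoulli posterior logits:

```python
import torch.nn.functional as F

targets = tokens[:, 1:]                            # next-token targets
recon = F.cross_entropy(                           # reconstruction term
    logits[:, :-1].reshape(-1, logits.size(-1)), targets.reshape(-1)
)
q = torch.sigmoid(z_logits)                        # posterior over plan bits
kl = (                                             # KL against a uniform prior
    q * torch.log(q.clamp_min(1e-8) / 0.5)
    + (1 - q) * torch.log((1 - q).clamp_min(1e-8) / 0.5)
).sum(-1).mean()
(recon + kl).backward()
```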

### πŸš€ Command Line

```bash
# Generate synthetic data and run demo
make demo

# Train models separately
make train-baseline  # Standard Transformer
make train-free      # Free Transformer
make compare         # Compare results
```

> 🎯 **Complete tutorial**: [Quick Start Guide](https://udapy.github.io/free-transformer/getting-started/quick-start/)

---

## Step-by-Step Demo

1. **Generate Small Synthetic Data**
   ```bash
   make generate-data-small
   ```

2. **Train Baseline Transformer**
   ```bash
   make train-baseline
   ```

3. **Train Free Transformer**
   ```bash
   make train-free
   ```

4. **Run Model Comparison**
   ```bash
   make compare
   ```

Or run the full pipeline:

```bash
make demo
```

Check results in:
- `checkpoints/baseline/`
- `checkpoints/free/`
- `results/comparison/results.json`
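
To peek at the numbers programmatically, the JSON file can simply be pretty-printed (the schema is whatever `make compare` wrote; this snippet makes no assumptions about its keys):

```python
import json

with open("results/comparison/results.json") as f:
    print(json.dumps(json.load(f), indent=2))
```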

---

## Key Features Comparison

| Feature | Standard Transformer | Free Transformer |
|---------|---------------------|------------------|
| **Generation** | Reactive (token-by-token) | Plan-then-generate |
| **Coherence** | Local | Global + Local |
| **Controllability** | Limited | High (via plan manipulation) |
| **Training** | Cross-entropy loss | Conditional VAE loss |
| **Memory** | Baseline | +10-15% (inference) |
| **Speed** | Baseline | -5-10% (inference) |

> πŸ”¬ **Detailed comparison**: [Architecture Overview](https://udapy.github.io/free-transformer/architecture/overview/)

---

## Repository Structure

```
free-transformer/
β”œβ”€β”€ src/free_transformer/
β”‚   β”œβ”€β”€ model.py
β”‚   β”œβ”€β”€ baseline.py
β”‚   β”œβ”€β”€ encoder.py
β”‚   β”œβ”€β”€ latent.py
β”‚   β”œβ”€β”€ injection.py
β”‚   β”œβ”€β”€ losses.py
β”‚   β”œβ”€β”€ synthetic_data.py
β”‚   β”œβ”€β”€ train_utils.py
β”‚   └── config.py
β”œβ”€β”€ examples/
β”‚   β”œβ”€β”€ train_baseline.py
β”‚   β”œβ”€β”€ train_free.py
β”‚   β”œβ”€β”€ eval_compare.py
β”‚   └── generate_data.py
β”œβ”€β”€ configs/
β”‚   β”œβ”€β”€ baseline.yaml
β”‚   └── free_transformer.yaml
β”œβ”€β”€ docker/
β”‚   β”œβ”€β”€ demo.sh
β”‚   └── README.md
β”œβ”€β”€ tests/
β”‚   β”œβ”€β”€ unit/
β”‚   β”œβ”€β”€ integration/
β”‚   └── test_comparison.py
β”œβ”€β”€ docs/
β”œβ”€β”€ Dockerfile
β”œβ”€β”€ Dockerfile.cpu
β”œβ”€β”€ docker-compose.yml
β”œβ”€β”€ Makefile
β”œβ”€β”€ pyproject.toml
β”œβ”€β”€ .python-version
β”œβ”€β”€ LICENSE
└── README.md
```

---

## Testing & Quality

Run all tests:

```bash
make test
```

Quality checks:

```bash
make quality
```

---

## Advanced Features

### πŸš€ **Multi-GPU Training**
```bash
# FSDP training with automatic GPU detection
make train-free-fsdp

# Custom distributed training
torchrun --nproc_per_node=auto examples/train_free.py --use-fsdp
```
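
For reference, here is a minimal, generic FSDP training step in plain PyTorch, using a stand-in model rather than the package's trainer; launch it with `torchrun --nproc_per_node=<gpus> script.py`:

```python
# Generic FSDP sketch (PyTorch 2.x). The nn.Sequential is a stand-in model.
import os
import torch
import torch.distributed as dist
import torch.nn as nn
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP, MixedPrecision

def main():
    dist.init_process_group("nccl")                 # one process per GPU
    torch.cuda.set_device(int(os.environ["LOCAL_RANK"]))
    model = nn.Sequential(nn.Linear(128, 512), nn.GELU(), nn.Linear(512, 128)).cuda()
    model = FSDP(model, mixed_precision=MixedPrecision(param_dtype=torch.bfloat16))
    opt = torch.optim.AdamW(model.parameters(), lr=3e-4)
    x = torch.randn(8, 128, device="cuda")
    model(x).pow(2).mean().backward()               # toy loss
    opt.step()
    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```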

### πŸ“Š **Flexible Data**
- HuggingFace datasets integration (see the sketch after this list)
- Built-in synthetic data generation
- Custom data loading pipelines
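
For instance, pulling raw text from a HuggingFace dataset might look like this; the dataset name and field are illustrative, and the package's own pipeline may differ:

```python
from datasets import load_dataset

# Any text dataset works; wikitext-2 is just a small, convenient example.
ds = load_dataset("wikitext", "wikitext-2-raw-v1", split="train")
texts = [ex["text"] for ex in ds if ex["text"].strip()]
```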

### πŸ”§ **Extensible Architecture**
- Modular components for easy customization
- Custom loss functions and training schedules
- Plugin system for new features

> πŸ“š **Learn more**: [Training Guide](https://udapy.github.io/free-transformer/training/guide/) | [Multi-GPU Setup](https://udapy.github.io/free-transformer/training/multi-gpu/)

---

## Documentation

πŸ“– **[Complete Documentation](https://udapy.github.io/free-transformer/)**

### Quick Links
- πŸš€ [**Getting Started**](https://udapy.github.io/free-transformer/getting-started/installation/) - Installation and setup
- πŸ—οΈ [**Architecture**](https://udapy.github.io/free-transformer/architecture/overview/) - How Free Transformer works
- 🎯 [**Training Guide**](https://udapy.github.io/free-transformer/training/guide/) - Training best practices
- πŸ“‹ [**API Reference**](https://udapy.github.io/free-transformer/api/model/) - Complete API documentation
- πŸ’‘ [**Examples**](https://udapy.github.io/free-transformer/examples/basic/) - Code examples and tutorials
- ❓ [**FAQ**](https://udapy.github.io/free-transformer/faq/) - Frequently asked questions

### Local Documentation
```bash
# Serve documentation locally
make docs-serve
# Open http://127.0.0.1:8000
```

---

## License

MIT License β€” see `LICENSE`

---

## Contributing

We welcome contributions! Please see our [Contributing Guide](https://udapy.github.io/free-transformer/development/contributing/) for details.

### Quick Development Setup
```bash
git clone https://github.com/udapy/free-transformer.git
cd free-transformer
make install-all  # Install with all dependencies
make test         # Run tests
make quality      # Check code quality
```

### Before Submitting
- βœ… Tests pass: `make test`
- βœ… Code quality: `make quality`  
- βœ… Documentation builds: `make docs-build`

> πŸ“‹ **Full guidelines**: [Contributing Guide](https://udapy.github.io/free-transformer/development/contributing/)

---

## FAQ

**Can I use this for real-world (non-synthetic) data?**  
Yes! Edit configs and use HuggingFace datasets.

**How do I run distributed training?**  
Use the `--use-fsdp` flag or the `make train-free-fsdp` target; see the Makefile and the [Multi-GPU Setup](https://udapy.github.io/free-transformer/training/multi-gpu/) docs.

**How do I change architecture parameters?**  
Edit YAML config files for layer size, latent dim, number of blocks, etc.
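
For example, a hypothetical way to build a model straight from a YAML file, assuming its keys match `ModelConfig`'s constructor arguments (the shipped configs may be structured differently):

```python
import yaml  # PyYAML
from free_transformer import FreeTransformer, ModelConfig

with open("configs/free_transformer.yaml") as f:
    model = FreeTransformer(ModelConfig(**yaml.safe_load(f)))
```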

**Can I run this without installing dependencies locally?**  
Yes! Use Docker: `docker-compose up free-transformer-demo` for a complete demo.

**What if I don't have a GPU?**  
Use the CPU Docker image: `make docker-build-cpu && make docker-run-cpu`

---

## Citation

If you use Free Transformer in your research, please cite:

```bibtex
@software{free_transformer,
  title={Free Transformer: Explicit Latent Planning for Autoregressive Generation},
  author={Phalak, Uday},
  year={2024},
  url={https://github.com/udapy/free-transformer},
  version={0.1.2}
}
```

## Links

- πŸ“¦ [**PyPI Package**](https://pypi.org/project/free-transformer/)
- πŸ“– [**Documentation**](https://udapy.github.io/free-transformer/)
- πŸ› [**Issues**](https://github.com/udapy/free-transformer/issues)
- πŸ’¬ [**Discussions**](https://github.com/udapy/free-transformer/discussions)

---

<div align="center">

**Free Transformer** - Bringing explicit planning to autoregressive generation

[Documentation](https://udapy.github.io/free-transformer/) β€’ [PyPI](https://pypi.org/project/free-transformer/) β€’ [GitHub](https://github.com/udapy/free-transformer)

</div>

            
