spegel

Name: spegel
Version: 0.1.3 (PyPI)
Summary: Spegel - Reflect the web through AI. A terminal browser with multiple AI-powered views.
Upload time: 2025-07-14 10:38:18
Requires Python: >=3.11
Keywords: ai, browser, llm, terminal, textual, tui, web
# Spegel - Reflect the web through AI

Spegel automatically rewrites websites into markdown optimised for viewing in the terminal.
Read the intro blog post [here](https://simedw.com/2025/06/23/introducing-spegel/).

This is a proof of concept; bugs are to be expected, but feel free to raise an issue or open a pull request.

## Screenshot
Sometimes you don't want to read through someone's life story just to get to a recipe.
![Recipe Example](https://simedw.com/2025/06/23/introducing-spegel/images/recipe_example.png)


## Installation

Requires Python 3.11+

```bash
$ pip install spegel
```
Or clone the repo and install it in editable mode:

```bash
# Clone and enter the directory
$ git clone https://github.com/simedw/spegel.git
$ cd spegel

# Install dependencies and the CLI
$ pip install -e .
```

## API Keys
Spegel uses [litellm](https://github.com/BerriAI/litellm), which supports most common LLMs, both local and hosted.

By default `Gemini 2.5 Flash Lite` is used, which requires the `GEMINI_API_KEY` environment variable to be set; see [env_example.txt](/env_example.txt).
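
For example, the key can be exported in your shell before launching Spegel (the value below is a placeholder, not a real key):

```shell
# Placeholder value - replace with your real Gemini API key
export GEMINI_API_KEY="your-api-key-here"
```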


## Usage

### Launch the browser

```bash
spegel                # Start with welcome screen
spegel bbc.com        # Open a URL immediately
```

Or, equivalently:

```bash
python -m spegel      # Start with welcome screen
python -m spegel bbc.com
```

### Basic controls
- `/`         – Open URL input
- `Tab`/`Shift+Tab` – Cycle links
- `Enter`     – Open selected link
- `e`         – Edit LLM prompt for current view
- `b`         – Go back
- `q`         – Quit

## Editing settings

Spegel loads settings from a TOML config file. You can customize views, prompts, and UI options.

**Config file search order:**
1. `./.spegel.toml` (current directory)
2. `~/.spegel.toml`
3. `~/.config/spegel/config.toml`
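
As a sketch (not Spegel's actual implementation), a first-match lookup over that search order could look like:

```python
from pathlib import Path

# Search order as documented above; assumption: the first existing file wins.
CANDIDATES = [
    Path(".spegel.toml"),                                # current directory
    Path.home() / ".spegel.toml",
    Path.home() / ".config" / "spegel" / "config.toml",
]

def find_config(candidates=None):
    """Return the first config path that exists, or None if none do."""
    for path in candidates if candidates is not None else CANDIDATES:
        if path.exists():
            return path
    return None
```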

To edit settings:
1. Copy the example config:
   ```bash
   cp example_config.toml .spegel.toml
   # or create ~/.spegel.toml
   ```
2. Edit `.spegel.toml` in your favorite editor.

Example snippet:
```toml
[settings]
default_view = "terminal"
app_title = "Spegel"

[ai]
default_model = "gpt-4.1-nano"

[[views]]
id = "raw"
name = "Raw View"
hotkey = "1"
order = "1"
prompt = ""

[[views]]
id = "terminal"
name = "Terminal"
hotkey = "2"
order = "2"
prompt = "Transform this webpage into the perfect terminal browsing experience! ..."
model = "claude-3-5-haiku-20241022"
```

## Local Models with Ollama

To run with a local model using Ollama, first pull and serve your desired model:

```bash
$ ollama pull llama2
$ ollama serve
```
Then set the model in `.spegel.toml` as follows:

```toml
model = "ollama/llama2"
```
Ollama supports models like Llama, Mistral, and many others.
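
For instance, to make the Ollama model the default for every view, the `[ai]` section shown earlier could read (assuming `default_model` accepts any litellm model string):

```toml
[ai]
default_model = "ollama/llama2"
```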

## License
MIT License - see LICENSE file for details.


For more, see the code or open an issue!
