tellaro-query-language

- Name: tellaro-query-language
- Version: 0.2.2
- Home page: https://github.com/tellaro/tellaro-query-language
- Summary: A flexible, human-friendly query language for searching and filtering structured data
- Upload time: 2025-07-29 16:10:56
- Maintainer: None
- Docs URL: None
- Author: Justin Henderson
- Requires Python: <3.14,>=3.11
- License: MIT
- Keywords: query, language, opensearch, elasticsearch, search, filter, tql

# Tellaro Query Language

[![PyPI version](https://badge.fury.io/py/tellaro-query-language.svg)](https://badge.fury.io/py/tellaro-query-language)
[![Tests Status](./badges/test-badge.svg?dummy=8484744)](./reports/pytest/junit.xml) [![Coverage Status](./badges/coverage-badge.svg?dummy=8484744)](./reports/coverage/index.html) [![Flake8 Status](./badges/flake8-badge.svg?dummy=8484744)](./reports/flake8/index.html)
[![Python 3.11-3.13](https://img.shields.io/badge/python-3.11%20%7C%203.12%20%7C%203.13-blue)](https://www.python.org/)
[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)

## What is TQL?

Tellaro Query Language (TQL) is a flexible, human-friendly query language for searching and filtering structured data. TQL is designed to provide a unified, readable syntax for expressing complex queries, supporting both simple and advanced search scenarios. It is especially useful for environments where data may come from different backends (such as OpenSearch or JSON files) and where users want to write queries that are portable and easy to understand.

TQL supports:
- **Field selection** (including nested fields)
- **Comparison and logical operators**
- **String, number, and list values**
- **Collection operators** (ANY, ALL) for working with list fields
- **Mutators** for post-processing or transforming field values
- **Operator precedence and parenthetical grouping** (AND, OR, NOT, etc.)
- **Field extraction** for analyzing query dependencies
- **Multiple backends** (in-memory evaluation, OpenSearch, file operations)
- **Statistical aggregations** for data analysis

---

## TQL Syntax Overview

### Basic Query Structure

TQL queries are generally structured as:

```
field [| mutator1 | mutator2 ...] operator value
```

- **field**: The field to query (e.g., `computer.name`, `os.ver`).
- **mutator**: (Optional) One or more transformations to apply to the field before comparison (e.g., `| lowercase`).
- **operator**: The comparison operator (e.g., `eq`, `contains`, `in`, `>`, `regexp`).
- **value**: The value to compare against (string, number, identifier, or list).

#### Example

```
computer.name | lowercase eq 'ha-jhend'
os.ver > 10
os.dataset in ['windows_server', 'enterprise desktop']
```

### Mutators

Mutators allow you to transform field values before comparison. For example, `| lowercase` will convert the field value to lowercase before evaluating the condition.

```
user.email | lowercase eq 'admin@example.com'
```
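
Mutators can also be chained with additional `|` separators; each is applied in order before the comparison:

```
user.name | trim | lowercase eq 'admin'
```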

### Operators

TQL supports a variety of comparison operators, including:

- `eq`, `=`, `ne`, `!=` (equals, not equals)
- `>`, `>=`, `<`, `<=` (greater/less than)
- `contains`, `in`, `regexp`, `startswith`, `endswith`
- `is`, `exists`, `range`, `between`, `cidr`
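
For illustration, a few comparisons using these operators (the field names are examples only; the exact argument forms for `range`, `between`, and `cidr` are covered in [Operators](./docs/operators.md)):

```
user.name contains 'admin'
file.path startswith '/tmp/'
url.domain endswith '.example.com'
process.command_line regexp '.*powershell.*'
status ne 'disabled'
```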

### Values

Values can be:
- **Strings**: `'value'` or `"value"`
- **Numbers**: `123`, `42`, `1.01`
- **Identifiers**: `computer01`, `admin`
- **Lists**: `["val1", "val2"]`

### Logical Expressions

TQL supports logical operators and grouping:

```
field1 eq 'foo' AND (field2 > 10 OR field3 in ['a', 'b'])
NOT field4 contains 'bar'
```

Operators supported: `AND`, `OR`, `NOT`, `ANY`, `ALL` (case-insensitive)

### Example Query

```
computer.name | lowercase eq 'ha-jhend' AND (os.ver > 10 OR os.dataset in ['windows_server', 'enterprise desktop'])
```

---

## Why TQL Matters

TQL provides a consistent, readable way to express queries across different data sources. It abstracts away backend-specific quirks (like OpenSearch's text vs. keyword fields) and lets users focus on what they want to find, not how to write backend-specific queries.

**Key benefits:**
- **Unified syntax**: Write one query, run it on many backends.
- **Mutators**: Easily transform data inline (e.g., lowercase, trim).
- **Readability**: Queries are easy to read and write, even for complex logic.
- **Extensible**: New operators and mutators can be added as needed.

---

## Example: TQL in Action

Suppose you want to find computers named "HA-JHEND" (case-insensitive), running Windows Server or Enterprise Desktop, and with an OS version greater than 10:

```
computer.name | lowercase eq 'ha-jhend' AND (os.ver > 10 OR os.dataset in ['windows_server', 'enterprise desktop'])
```

This query will:
- Convert `computer.name` to lowercase and compare to `'ha-jhend'`
- Check if `os.ver` is greater than 10
- Check if `os.dataset` is in the provided list

---

## Implementation Notes

TQL is implemented using [pyparsing](https://pyparsing-docs.readthedocs.io/en/latest/) to define the grammar and parse queries. The parser supports mutators, operator precedence, and both standard and reversed operator forms (e.g., `'value' in field`).
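
As a rough sketch of that approach (simplified, and not the actual grammar in `src/tql/` — rule names and operator coverage here are assumptions), the basic `field [| mutator] operator value` shape with precedence and grouping can be expressed in pyparsing like this:

```python
import pyparsing as pp

# Simplified illustration of a TQL-like grammar; the real grammar lives in src/tql/.
field = pp.Word(pp.alphas, pp.alphanums + "._")
mutators = pp.ZeroOrMore(pp.Suppress("|") + pp.Word(pp.alphas, pp.alphanums + "_"))
operator = pp.one_of("eq ne contains in startswith endswith regexp = != >= <= > <")
value = (pp.QuotedString("'") | pp.QuotedString('"')
         | pp.pyparsing_common.number | pp.Word(pp.alphanums + "._"))
list_value = pp.Suppress("[") + pp.delimited_list(value) + pp.Suppress("]")

condition = pp.Group(
    field("field") + pp.Group(mutators)("mutators")
    + operator("op") + (list_value | value)("value")
)

# NOT binds tighter than AND, which binds tighter than OR; parentheses group as usual.
expression = pp.infix_notation(condition, [
    (pp.CaselessKeyword("NOT"), 1, pp.OpAssoc.RIGHT),
    (pp.CaselessKeyword("AND"), 2, pp.OpAssoc.LEFT),
    (pp.CaselessKeyword("OR"), 2, pp.OpAssoc.LEFT),
])

print(expression.parse_string("computer.name | lowercase eq 'ha-jhend' AND os.ver > 10"))
```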

See `src/tql/` for the implementation, including the parser grammar and evaluation logic.

## Documentation

For comprehensive documentation, see the [`docs/`](./docs/) folder:

- **[Getting Started](./docs/getting-started.md)** - Learn TQL basics with development examples
- **[Development Guide](./docs/development-guide.md)** - File operations, testing, and common patterns
- **[OpenSearch Integration](./docs/opensearch-integration.md)** - Convert TQL to OpenSearch DSL and Lucene queries
- **[Syntax Reference](./docs/syntax-reference.md)** - Complete grammar and syntax specification  
- **[Operators](./docs/operators.md)** - All comparison and logical operators
- **[Mutators](./docs/mutators.md)** - Field transformation functions (25+ mutators available)
- **[Stats & Aggregations](./docs/stats.md)** - Statistical analysis and data aggregation functions
- **[Examples](./docs/examples.md)** - Real-world query examples for security, DevOps, and business use cases
- **[Best Practices](./docs/best-practices.md)** - Performance optimization and maintainability tips

## Quick Start

### Installation

```bash
# Install from PyPI
pip install tellaro-query-language

# Or install with OpenSearch support
pip install tellaro-query-language[opensearch]
```

### Basic Usage

```python
from tql import TQL

# Initialize TQL
tql = TQL()

# Query data
data = [{'name': 'Alice', 'age': 30}, {'name': 'Bob', 'age': 25}]
results = tql.query(data, 'age > 27')
print(f'Found {len(results)} people over 27: {results}')
# Output: Found 1 people over 27: [{'name': 'Alice', 'age': 30}]
```

For OpenSearch integration examples and production usage patterns, see the [Package Usage Guide](docs/package-usage-guide.md).

### Development Setup

For contributors and developers who want to work on TQL itself:

```bash
# Clone the repository
git clone https://github.com/tellaro/tellaro-query-language.git
cd tellaro-query-language

# Install with poetry (includes all dev dependencies)
poetry install

# Load environment variables for integration tests
cp .env.example .env
# Edit .env with your OpenSearch credentials

# Run tests
poetry run tests
```

**Note**: The development setup uses `python-dotenv` to load OpenSearch credentials from `.env` files for integration testing. This is NOT required when using TQL as a package - see the [Package Usage Guide](docs/package-usage-guide.md) for production configuration patterns.

### TQL Playground

The repository includes an interactive web playground for testing TQL queries:

```bash
# Navigate to the playground directory
cd playground

# Start with Docker (recommended)
docker-compose up

# Or start with OpenSearch included
docker-compose --profile opensearch up
```

Access the playground at:
- Frontend: http://localhost:5173
- API: http://localhost:8000
- API Docs: http://localhost:8000/docs

The playground uses your local TQL source code, so any changes you make are immediately reflected. See [playground/README.md](playground/README.md) for more details.

### File Operations

```python
from tql import TQL

# Query JSON files directly
tql = TQL()
results = tql.query("data.json", "user.role eq 'admin' AND status eq 'active'")

# Query with field mappings for OpenSearch
mappings = {"hostname": "agent.name.keyword"}
tql_mapped = TQL(mappings)
opensearch_dsl = tql_mapped.to_opensearch("hostname eq 'server01'")

# Extract fields from a complex query
query = "process.name eq 'explorer.exe' AND (user.id eq 'admin' OR user.groups contains 'administrators')"
fields = tql.extract_fields(query)
print(fields)  # ['process.name', 'user.groups', 'user.id']
```

### Query Analysis and Health Evaluation

TQL provides context-aware query analysis to help you understand performance implications before execution:

```python
from tql import TQL

tql = TQL()

# Analyze for in-memory execution (default)
query = "field | lowercase | trim eq 'test'"
analysis = tql.analyze_query(query)  # or explicitly: analyze_query(query, context="in_memory")

print(f"Health: {analysis['health']['status']}")  # 'good' - fast mutators don't impact in-memory
print(f"Score: {analysis['health']['score']}")    # 100
print(f"Has mutators: {analysis['stats']['has_mutators']}")  # True

# Analyze the same query for OpenSearch execution
analysis = tql.analyze_query(query, context="opensearch")
print(f"Health: {analysis['health']['status']}")  # 'fair' - post-processing required
print(f"Score: {analysis['health']['score']}")    # 85

# Check mutator-specific health
if 'mutator_health' in analysis:
    print(f"Mutator health: {analysis['mutator_health']['health_status']}")
    for reason in analysis['mutator_health']['health_reasons']:
        print(f"  - {reason['reason']}")

# Slow mutators impact both contexts
slow_query = "hostname | nslookup contains 'example.com'"
analysis = tql.analyze_query(slow_query)
print(f"In-memory health: {analysis['health']['status']}")  # 'fair' or 'poor' - network I/O

# Query complexity analysis
complex_query = "(a > 1 OR b < 2) AND (c = 3 OR (d = 4 AND e = 5))"
analysis = tql.analyze_query(complex_query)
print(f"Depth: {analysis['complexity']['depth']}")
print(f"Fields: {analysis['stats']['fields']}")
print(f"Operators: {analysis['stats']['operators']}")
```

### Post-Processing with OpenSearch

TQL intelligently handles mutators based on field mappings. When OpenSearch can't perform certain operations (like case-insensitive searches on keyword fields), TQL applies post-processing:

```python
# Field mappings with only keyword fields
mappings = {"username": {"type": "keyword"}, "department": {"type": "keyword"}}
tql = TQL(mappings)

# This query requires post-processing since keyword fields can't do case-insensitive contains
query = "username | lowercase contains 'admin' AND department eq 'Engineering'"

# Analyze the query (analyze_opensearch_query is deprecated, use analyze_query instead)
analysis = tql.analyze_query(query, context="opensearch")
print(f"Health: {analysis['health']['status']}")  # 'fair' (post-processing required)

# Execute with automatic post-processing
result = tql.execute_opensearch(
    opensearch_client=client,
    index="users",
    query=query
)
# OpenSearch returns all Engineering users, TQL filters to only those with 'admin' in username

# Run the demo to see this in action
# poetry run python post_processing_demo.py
```

### Development Examples

```bash
# Run comprehensive demos
poetry run python demo.py                          # Basic functionality
poetry run python intelligent_mapping_demo.py      # Field mapping features
poetry run python test_requested_functionality.py  # Core functionality tests
poetry run python field_extraction_demo.py         # Field extraction
poetry run python post_processing_demo.py          # Post-processing filtering

# Run tests
poetry run pytest tests/ -v

# Run integration tests with OpenSearch (requires OpenSearch)
# 1. Copy .env.example to .env and configure connection settings
# 2. Set OPENSEARCH_INTEGRATION_TEST=true in .env
poetry run pytest tests/test_opensearch_integration.py -v
```

## Contributing

TQL supports 25+ mutators including string manipulation, encoding/decoding, DNS operations, and network analysis. See the [Mutators documentation](./docs/mutators.md) for the complete list.

To add new mutators or operators, see the implementation in `src/tql/mutators.py` and `src/tql/parser.py`.
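
Conceptually, a mutator is just a function that transforms a field value before the comparison runs. As a plain-Python illustration of that idea (this is not TQL's actual registration API; see `src/tql/mutators.py` for the real mechanism):

```python
from typing import Any, Callable

# Illustrative only: a mutator maps a field value to a transformed value.
Mutator = Callable[[Any], Any]

MUTATORS: dict[str, Mutator] = {
    "lowercase": lambda v: v.lower() if isinstance(v, str) else v,
    "trim": lambda v: v.strip() if isinstance(v, str) else v,
}

def apply_mutators(value: Any, names: list[str]) -> Any:
    """Apply a chain of mutators left to right, as in `field | lowercase | trim`."""
    for name in names:
        value = MUTATORS[name](value)
    return value

print(apply_mutators("  HA-JHEND  ", ["lowercase", "trim"]))  # -> 'ha-jhend'
```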

### Statistical Aggregations

TQL supports powerful data analysis with stats expressions:

```tql
# Simple aggregation
| stats sum(revenue)

# Grouped analysis  
| stats count(requests), average(response_time) by server_name

# Top N analysis
| stats sum(sales, top 10) by product_category

# Complex analytics
status eq 'success' 
| stats count(requests), sum(bytes), average(response_time), max(cpu_usage) by endpoint
```

Stats functions include: `sum`, `min`, `max`, `count`, `unique_count`, `average`, `median`, `percentile_rank`, `zscore`, `std`

## Documentation

Comprehensive documentation is available in the [docs](./docs/) directory:

- [**Getting Started**](./docs/getting-started.md) - Quick introduction to TQL
- [**Syntax Reference**](./docs/syntax-reference.md) - Complete syntax guide
- [**Operators**](./docs/operators.md) - All comparison and logical operators
- [**Mutators**](./docs/mutators.md) - Field transformation functions
- [**Mutator Caching & Security**](./docs/mutator-caching.md) - Performance optimization and security controls
- [**OpenSearch Integration**](./docs/opensearch-integration.md) - Using TQL with OpenSearch
- [**Examples**](./docs/examples.md) - Real-world query examples
- [**Architecture**](./docs/architecture.md) - Modular architecture and design
- [**Migration Guide**](./docs/migration-guide.md) - Upgrading from older versions

## Development

### Installation

```bash
# Clone the repository
git clone https://github.com/tellaro/tellaro-query-language.git
cd tellaro-query-language

# Install with poetry
poetry install

# Or install with pip
pip install -e .
```

### Testing

This project supports Python 3.11, 3.12, and 3.13. We use `nox` for automated testing across all supported versions.

```bash
# Install test dependencies
poetry install --with dev

# Run tests on all Python versions
poetry run nox -s tests

# Run tests on a specific version
poetry run nox -s tests-3.12

# Quick test run (fail fast, no coverage)
poetry run nox -s test_quick

# Run linting and formatting
poetry run nox -s lint
poetry run nox -s format

# Run all checks
poetry run nox -s all
```

For more detailed testing instructions, see [TESTING.md](TESTING.md).
            
