hacs-tools

Name: hacs-tools
Version: 0.2.3
Summary: Core tools and utilities for HACS (Healthcare Agent Communication Standard)
Upload time: 2025-07-09 16:24:16
Requires Python: >=3.10
License: Apache-2.0
Keywords: crud, evidence, healthcare, memory, search, tools, validation
Homepage: https://github.com/solanovisitor/hacs
# HACS Tools

Core tools and utilities for Healthcare Agent Communication Standard (HACS).

## Overview

`hacs-tools` provides essential tools for working with HACS data, including CRUD operations, search functionality, validation, memory management, evidence handling, and structured data processing. This package focuses on core functionality that is shared across all HACS implementations.

## 📦 **Modular Architecture**

Protocol adapters and integrations are available in separate packages:

- **[hacs-langgraph](https://pypi.org/project/hacs-langgraph/)**: LangGraph workflow integration
- **[hacs-crewai](https://pypi.org/project/hacs-crewai/)**: CrewAI multi-agent workflows  
- **[hacs-autogen](https://pypi.org/project/hacs-autogen/)**: AutoGen UI integration
- **[hacs-anthropic](https://pypi.org/project/hacs-anthropic/)**: Anthropic AI integration
- **[hacs-openai](https://pypi.org/project/hacs-openai/)**: OpenAI integration
- **[hacs-pinecone](https://pypi.org/project/hacs-pinecone/)**: Pinecone vector store
- **[hacs-qdrant](https://pypi.org/project/hacs-qdrant/)**: Qdrant vector store

Install only the tools and adapters you need:
```bash
pip install hacs-tools                     # Core tools only
pip install hacs-tools hacs-langgraph      # Core tools + LangGraph
pip install hacs-tools hacs-crewai         # Core tools + CrewAI
pip install hacs-tools hacs-openai hacs-pinecone  # Core + OpenAI + Pinecone
```

## Key Components

### CRUD Operations
- Create, Read, Update, Delete operations for all HACS models
- Bulk operations for efficient data processing
- Transaction support and rollback capabilities
- Data validation and integrity checks
- Storage backend abstraction (memory, file, database)
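
The transaction and rollback behavior described above can be illustrated with a minimal, self-contained sketch. This is not the actual `hacs-tools` API; the `InMemoryStore` class and its methods are hypothetical, shown only to make the rollback semantics concrete:

```python
from contextlib import contextmanager
from copy import deepcopy

class InMemoryStore:
    """Minimal store illustrating transactional writes with rollback."""
    def __init__(self):
        self._records: dict[str, dict] = {}

    @contextmanager
    def transaction(self):
        # Snapshot the current state; restore it if the block raises.
        snapshot = deepcopy(self._records)
        try:
            yield self
        except Exception:
            self._records = snapshot
            raise

    def create(self, resource_id: str, data: dict) -> None:
        self._records[resource_id] = data

store = InMemoryStore()
store.create("patient-1", {"name": "Alice"})
try:
    with store.transaction():
        store.create("patient-2", {"name": "Bob"})
        raise RuntimeError("simulated failure")  # triggers rollback
except RuntimeError:
    pass
print(sorted(store._records))  # only "patient-1" survives
```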

### Search and Retrieval
- Semantic search capabilities with hybrid scoring
- Structured query interface with FHIR parameter support
- Full-text search with medical terminology
- Faceted search and filtering
- Resource-specific search optimizations
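
"Hybrid scoring" typically means blending an embedding-based similarity with a lexical match score. A self-contained sketch of that idea (the function names and the fixed `alpha` weight are illustrative, not the `hacs-tools` implementation, and the semantic similarities are stubbed where an embedding model would supply them):

```python
def keyword_score(query: str, text: str) -> float:
    """Fraction of query terms that appear in the document text."""
    terms = query.lower().split()
    hits = sum(1 for t in terms if t in text.lower())
    return hits / len(terms) if terms else 0.0

def hybrid_score(semantic: float, keyword: float, alpha: float = 0.7) -> float:
    """Weighted blend of semantic similarity and keyword overlap."""
    return alpha * semantic + (1 - alpha) * keyword

docs = {
    "obs-1": "elevated blood pressure consistent with hypertension",
    "obs-2": "routine visit, no cardiovascular findings",
}
# Semantic similarities would come from an embedding model; stubbed here.
semantic = {"obs-1": 0.91, "obs-2": 0.20}
ranked = sorted(
    docs,
    key=lambda d: hybrid_score(semantic[d], keyword_score("hypertension", docs[d])),
    reverse=True,
)
print(ranked[0])  # obs-1
```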

### Memory Management
- Persistent memory storage and retrieval
- Memory consolidation and lifecycle management
- Cross-resource memory linking
- Importance-based memory prioritization
- Memory search and querying
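
Importance-based prioritization is often combined with recency, so that older memories gradually lose rank. A common pattern is exponential decay with a half-life; the sketch below illustrates that pattern only and is not the `hacs-tools` algorithm:

```python
import math
from dataclasses import dataclass

@dataclass
class Memory:
    content: str
    importance: float  # 0..1
    age_hours: float

def priority(m: Memory, half_life_hours: float = 24.0) -> float:
    """Importance decayed exponentially with age (half-life model)."""
    return m.importance * math.exp(-math.log(2) * m.age_hours / half_life_hours)

memories = [
    Memory("allergy to penicillin", importance=0.9, age_hours=72),
    Memory("prefers morning appointments", importance=0.4, age_hours=1),
]
# A recent low-importance memory can outrank a stale high-importance one.
ranked = sorted(memories, key=priority, reverse=True)
```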

### Evidence Management
- Clinical evidence storage and retrieval
- Evidence quality assessment and scoring
- Evidence linking and relationship management
- Citation and reference handling
- Evidence-based decision support
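
Evidence quality scoring usually weighs study design against study size. The rubric below is entirely hypothetical (the weights, level names, and blend are assumptions, not the `hacs-tools` scoring model), but it shows the shape of such an assessment:

```python
# Hypothetical design weights; hacs-tools' actual rubric may differ.
STUDY_TYPE_WEIGHT = {
    "meta_analysis": 1.0,
    "randomized_controlled_trial": 0.9,
    "cohort_study": 0.6,
    "case_report": 0.3,
}

def quality_score(study_type: str, sample_size: int) -> float:
    """Blend a study-design weight with a saturating sample-size factor."""
    design = STUDY_TYPE_WEIGHT.get(study_type, 0.2)
    size_factor = min(sample_size / 1000, 1.0)  # saturates at n=1000
    return round(0.7 * design + 0.3 * size_factor, 3)

print(quality_score("randomized_controlled_trial", 500))  # 0.78
```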

### Validation and Quality Assurance
- Comprehensive data validation with Actor context
- Schema validation and business rule enforcement
- Cross-reference validation and integrity checks
- FHIR compliance validation
- Clinical data quality assessment
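
Cross-reference validation means checking that one resource's references resolve to resources that actually exist. A minimal, self-contained sketch of that check (the function and field names here are illustrative, not the `hacs-tools` validator API):

```python
def validate_references(observation: dict, known_patient_ids: set[str]) -> list[str]:
    """Cross-reference check: the observation must point at a known patient."""
    errors = []
    subject = observation.get("subject")
    if subject is None:
        errors.append("observation has no subject reference")
    elif subject not in known_patient_ids:
        errors.append(f"unknown patient reference: {subject}")
    return errors

errors = validate_references({"subject": "patient-999"}, {"patient-123"})
print(errors)  # ["unknown patient reference: patient-999"]
```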

### Structured Data Processing
- LLM function specification generation
- Structured output validation and coercion
- Multi-provider tool calling patterns
- Healthcare-specific data transformation
- Clinical data normalization
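
LLM function specification generation means turning a data model into the JSON-schema-style tool description that providers such as OpenAI accept. The sketch below shows the idea with a plain dataclass; it is not the `hacs-tools` implementation, and `PatientInput` and `function_spec` are hypothetical names:

```python
from dataclasses import dataclass, fields

@dataclass
class PatientInput:
    full_name: str
    age: int

_JSON_TYPES = {str: "string", int: "integer", float: "number", bool: "boolean"}

def function_spec(model, name: str, description: str) -> dict:
    """Build an OpenAI-style function schema from a dataclass's fields."""
    props = {f.name: {"type": _JSON_TYPES[f.type]} for f in fields(model)}
    return {
        "name": name,
        "description": description,
        "parameters": {
            "type": "object",
            "properties": props,
            "required": [f.name for f in fields(model)],
        },
    }

spec = function_spec(PatientInput, "create_patient", "Register a new patient")
```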

### Base Vectorization Interfaces
- Abstract embedding model protocols
- Vector store interface definitions
- Metadata management for vectors
- Base vectorization utilities
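
"Abstract embedding model protocols" suggests structural interfaces in the style of `typing.Protocol`: any class with the right methods satisfies the interface without inheriting from it. A self-contained sketch of that pattern (the protocol and class names are illustrative, not the actual `hacs-tools` definitions):

```python
from typing import Protocol

class EmbeddingModel(Protocol):
    """Structural interface: anything with embed(text) -> list[float] qualifies."""
    def embed(self, text: str) -> list[float]: ...

class VectorStore(Protocol):
    def upsert(self, vector_id: str, vector: list[float], metadata: dict) -> None: ...
    def query(self, vector: list[float], top_k: int) -> list[str]: ...

class HashEmbedding:
    """Toy embedding for demonstration; satisfies EmbeddingModel structurally."""
    def embed(self, text: str) -> list[float]:
        return [float(ord(c) % 7) for c in text[:4]]

def vectorize(model: EmbeddingModel, text: str) -> list[float]:
    return model.embed(text)

vec = vectorize(HashEmbedding(), "fever")
```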

## Installation

```bash
pip install hacs-tools
```

## Quick Start

```python
from hacs_tools import (
    CreateResource, ReadResource, UpdateResource, DeleteResource,
    StorageManager, PermissionManager, DataValidator
)
from hacs_models import Patient, Observation

# CRUD operations (each call is authorized against an Actor,
# e.g. doctor = Actor(name="Dr. Smith", role="physician") from hacs-core)
storage = StorageManager()
permissions = PermissionManager()

# Create a patient
create_op = CreateResource()
patient = Patient(full_name="Alice Johnson", age=35)
patient_id = create_op.execute(patient, actor=doctor)

# Read patient
read_op = ReadResource()
retrieved_patient = read_op.execute(Patient, patient_id, actor=doctor)

# Update patient
update_op = UpdateResource()
retrieved_patient.full_name = "Alice Johnson-Smith"
update_op.execute(retrieved_patient, actor=doctor)

# Delete patient
delete_op = DeleteResource()
delete_op.execute(Patient, patient_id, actor=doctor)

# Validation
validator = DataValidator()
is_valid = validator.validate_patient(patient, actor=doctor)
```

## Core Tools

### CRUD Operations
```python
from hacs_tools import CreatePatient, ReadPatient, CreateObservation

# Specialized CRUD operations
create_patient = CreatePatient()
patient = create_patient.execute(patient_data, actor=doctor)

read_patient = ReadPatient()
patient = read_patient.execute(patient_id, actor=nurse)

create_obs = CreateObservation()
observation = create_obs.execute(obs_data, actor=doctor)
```

### Search and Retrieval
```python
from hacs_tools.search import SemanticSearch, FHIRSearch

# Semantic search
search = SemanticSearch()
results = search.find_patients(query="hypertension", actor=doctor, limit=10)

# FHIR parameter search
fhir_search = FHIRSearch()
results = fhir_search.search_observations(
    patient_id="patient-123",
    code="blood-pressure",
    actor=doctor
)
```

### Memory Management
```python
from hacs_tools.memory import MemoryManager

memory = MemoryManager()
memory.store_memory(memory_block, actor=doctor)
memories = memory.retrieve_memories(patient_id, actor=doctor)
consolidated = memory.consolidate_memories(workflow_id, actor=doctor)
```

### Evidence Management
```python
from hacs_tools.evidence import EvidenceManager

evidence_mgr = EvidenceManager()
evidence = evidence_mgr.create_evidence(content, evidence_type, actor=researcher)
evidence_mgr.link_to_patient(evidence, patient_id, actor=doctor)
results = evidence_mgr.search_evidence(query="ACE inhibitors", actor=doctor)
```

### Validation
```python
from hacs_tools.validation import DataValidator

validator = DataValidator()
is_valid = validator.validate_patient(patient_data, actor=doctor)
errors = validator.get_validation_errors(patient_data)
fhir_valid = validator.validate_fhir_compliance(resource, actor=doctor)
```

### Structured Data Processing
```python
from hacs_tools.structured import StructuredProcessor, generate_function_spec

processor = StructuredProcessor()
structured_data = processor.process_clinical_text(text, actor=doctor)

# Generate LLM function specs
spec = generate_function_spec(Patient, pattern="openai")
```

### Vectorization Base Classes
```python
from hacs_tools.vectorization import HACSVectorizer, VectorMetadata

# Base vectorization (implement with specific vector stores)
vectorizer = HACSVectorizer()
metadata = VectorMetadata(
    resource_type="Patient",
    resource_id="patient-123",
    content_hash="abc123"
)
```

## Advanced Features

### Bulk Operations
```python
from hacs_tools import CreateResource

# Bulk create patients
patients = [Patient(full_name=f"Patient {i}", age=30+i) for i in range(100)]
bulk_create = CreateResource()
patient_ids = bulk_create.bulk_execute(patients, actor=admin)
```

### Storage Backend Configuration
```python
from hacs_tools import set_storage_backend, get_storage_manager, StorageBackend

# Configure storage backend
set_storage_backend(StorageBackend.DATABASE)
storage = get_storage_manager()
```

### Memory Consolidation
```python
from hacs_tools.memory import MemoryManager

memory = MemoryManager()
# Add consolidation rule
memory.add_consolidation_rule(
    lambda memories: sorted(memories, key=lambda m: m.importance_score, reverse=True)[:10]
)
consolidated = memory.consolidate_memories("workflow-123", actor=doctor)
```

### Evidence Quality Assessment
```python
from hacs_tools.evidence import EvidenceManager

evidence_mgr = EvidenceManager()
quality_score = evidence_mgr.assess_evidence_quality(evidence, actor=researcher)
evidence_mgr.update_quality_score(evidence_id, quality_score, actor=researcher)
```

### Hybrid Search
```python
from hacs_tools.search import SemanticSearch

search = SemanticSearch()
results = search.hybrid_search(
    query="cardiovascular risk factors",
    resource_types=["Patient", "Observation"],
    actor=doctor,
    method="hybrid"  # combines semantic + text search
)
```

## Integration Examples

### With LangGraph
```python
# Install: pip install hacs-tools hacs-langgraph
from hacs_tools import CreateResource
from hacs_tools.memory import MemoryManager
from hacs_langgraph import LangGraphAdapter

# Core tools work seamlessly with LangGraph
adapter = LangGraphAdapter()
memory_mgr = MemoryManager()
create_resource = CreateResource()

# Use in LangGraph workflows
state = adapter.create_hacs_state(workflow_type="clinical_assessment", actor=doctor)
```

### With CrewAI
```python
# Install: pip install hacs-tools hacs-crewai
from hacs_tools.evidence import EvidenceManager
from hacs_tools.validation import DataValidator
from hacs_crewai import CrewAIAdapter

# Core tools work seamlessly with CrewAI
evidence_mgr = EvidenceManager()
validator = DataValidator()
adapter = CrewAIAdapter()

# Use in CrewAI workflows
task = adapter.create_evidence_synthesis_task(evidence_list, query, actor)
```

## Documentation

For complete documentation, see the [HACS Documentation](https://github.com/solanovisitor/hacs/blob/main/docs/README.md).

## Related Packages

- **[hacs-core](https://pypi.org/project/hacs-core/)**: Core HACS data models and utilities
- **[hacs-models](https://pypi.org/project/hacs-models/)**: Clinical data models (Patient, Observation, etc.)
- **[hacs-langgraph](https://pypi.org/project/hacs-langgraph/)**: LangGraph workflow integration
- **[hacs-crewai](https://pypi.org/project/hacs-crewai/)**: CrewAI multi-agent workflows
- **[hacs-anthropic](https://pypi.org/project/hacs-anthropic/)**: Anthropic AI integration
- **[hacs-openai](https://pypi.org/project/hacs-openai/)**: OpenAI integration

## License

Licensed under the Apache License, Version 2.0. See [LICENSE](https://github.com/solanovisitor/hacs/blob/main/LICENSE) for details.

## Contributing

See [Contributing Guidelines](https://github.com/solanovisitor/hacs/blob/main/docs/contributing/guidelines.md) for information on how to contribute to HACS Tools.

            
