# 🎯 Casting Expert
A comprehensive Python package for type casting, conversion, and validation with built-in CLI support, aimed at developers, data scientists, and system administrators.
## 🌟 Features
### 🔄 Core Features
- Advanced type casting with fallback options
- Comprehensive input validation
- Type inference for automatic conversion
- Multiple format support (JSON, YAML, Query String)
- Nested dictionary handling
- Error handling and validation
### 🛠️ CLI Features
- Multiple input methods (string, file, stdin)
- Multiple output formats
- Pretty printing options
- File input/output support
- Quiet mode operation
### 🎯 Target Audiences
- 👨‍💻 Software Developers
- 📊 Data Scientists
- 🔧 System Administrators
- 👥 IT Professionals
## 📦 Installation
### Basic Installation
```bash
pip install casting-expert
```
### 🚀 Optional Features
Choose the installation that best suits your needs:
```bash
# 📄 YAML Support (YAML parsing and output)
pip install "casting-expert[yaml]"
# 📊 Data Science Tools (pandas, numpy integration)
pip install "casting-expert[data]"
# 🌐 Web Development (requests, aiohttp for API integration)
pip install "casting-expert[web]"
# 🔧 Development Tools (testing, linting, type checking)
pip install "casting-expert[dev]"
# ⭐ All Features (complete installation)
pip install "casting-expert[full]"
```
## 🔧 Module Usage
### 1. 🎯 Basic Type Casting
```python
from casting_expert import safe_cast, cast_to_type
# Safe casting with None on failure
result1 = safe_cast("123", int) # Returns: 123
result2 = safe_cast("invalid", int) # Returns: None
# Casting with default values
result3 = cast_to_type("123", int, default=0) # Returns: 123
result4 = cast_to_type("invalid", int, default=0) # Returns: 0
# Type casting with validation
from casting_expert import validate_input
is_valid = validate_input("123", int) # Returns: True
can_cast = validate_input("abc", int) # Returns: False
```
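Under the hood, safe casting is essentially a try/except wrapper around the target type's constructor. A minimal sketch of the idea (an illustration, not the package's actual implementation):

```python
def safe_cast_sketch(value, target_type, default=None):
    """Return value converted to target_type, or default on failure."""
    try:
        return target_type(value)
    except (ValueError, TypeError):
        return default

print(safe_cast_sketch("123", int))         # 123
print(safe_cast_sketch("invalid", int))     # None
print(safe_cast_sketch("invalid", int, 0))  # 0
```

Catching only `ValueError` and `TypeError` keeps genuine bugs (e.g. `KeyboardInterrupt`) from being silently swallowed.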
### 2. 📝 String to Dictionary Conversion
```python
from casting_expert import parse_string_to_dict, ParsingError
# Simple JSON parsing
json_str = '{"name": "John", "age": 30}'
try:
    data = parse_string_to_dict(json_str)
    print(data)  # {'name': 'John', 'age': 30}
except ParsingError as e:
    print(f"Error: {e}")
# Different format support
from casting_expert import (
    parse_json,
    parse_yaml_like,
    parse_query_string,
    parse_key_value_pairs
)
# 📋 JSON format
json_data = parse_json('{"name": "John"}')
# 📄 YAML-like format
yaml_data = parse_yaml_like("""
name: John
age: 30
nested:
    key: value
""")
# 🔍 Query string format
query_data = parse_query_string("name=John&age=30&tags=python,coding")
# 📑 Key-value pairs
kv_data = parse_key_value_pairs("""
name: John
age: 30
""")
```
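For plain query-string parsing without the package's type inference, the standard library's `urllib.parse.parse_qs` is a useful point of comparison; note that it returns every value as a list of strings:

```python
from urllib.parse import parse_qs

raw = parse_qs("name=John&age=30&tags=python,coding")
print(raw)  # {'name': ['John'], 'age': ['30'], 'tags': ['python,coding']}
```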
### 3. 🔄 Type Inference
```python
from casting_expert import TypeInference
# Automatic type detection
raw_data = {
    "id": "123",
    "active": "true",
    "score": "98.6",
    "tags": "python,coding",
    "date": "2024-03-12"
}
typed_data = TypeInference.infer_types_in_dict(raw_data)
# Result:
# {
#     "id": 123,                     # Integer
#     "active": True,                # Boolean
#     "score": 98.6,                 # Float
#     "tags": ["python", "coding"],  # List
#     "date": datetime(2024, 3, 12)  # DateTime
# }
# Single value inference
number = TypeInference.infer_type("123") # Returns: 123
boolean = TypeInference.infer_type("true") # Returns: True
date = TypeInference.infer_type("2024-03-12") # Returns: datetime object
```
### 4. ✅ Dictionary Validation
```python
from casting_expert import DictValidator, ValidationError
# Create validation schema
user_schema = {
    "name": DictValidator.create_field(
        str,
        required=True,
        min_length=2,
        pattern=r'^[A-Za-z\s]+$',
        error_messages={
            "pattern": "Name should contain only letters and spaces",
            "required": "Name is required"
        }
    ),
    "age": DictValidator.create_field(
        int,
        min_value=0,
        max_value=150,
        error_messages={
            "min_value": "Age cannot be negative",
            "max_value": "Age cannot be greater than 150"
        }
    ),
    "email": DictValidator.create_field(
        str,
        required=True,
        pattern=r'^[\w\.-]+@[\w\.-]+\.\w+$',
        error_messages={"pattern": "Invalid email format"}
    ).add_validator(
        lambda x: not x.endswith('.temp'),
        "Temporary email domains are not allowed"
    )
}
# Validate data
data = {"name": "John", "age": 30, "email": "john@example.com"}
try:
    result = DictValidator.validate(data, user_schema)
    if result.is_valid:
        print("✅ Validation passed!")
    else:
        for issue in result.issues:
            print(f"⚠️ {issue.field}: {issue.message}")
except ValidationError as e:
    print(f"❌ Validation failed: {e}")
```
### 5. 💾 Dictionary Serialization
```python
from casting_expert import DictSerializer
data = {
    "name": "John",
    "age": 30,
    "scores": [95, 87, 91],
    "details": {
        "city": "New York",
        "role": "developer"
    }
}
# JSON output (pretty-printed)
json_str = DictSerializer.to_json(data, pretty=True)
# Query string format
query_str = DictSerializer.to_query_string(data, prefix='?')
# YAML format
yaml_str = DictSerializer.to_yaml_like(data)
# Key-value format
kv_str = DictSerializer.to_key_value(data, delimiter=': ')
```
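For flat dictionaries, the standard library's `urllib.parse.urlencode` produces an equivalent query string (nested values would need flattening first):

```python
from urllib.parse import urlencode

flat = {"name": "John", "age": 30}
print("?" + urlencode(flat))  # ?name=John&age=30
```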
## 🖥️ CLI Usage
### Basic Commands
1. **📝 Parse String Input**
```bash
# Simple parsing
casting-expert -s '{"name": "John", "age": 30}'
# Pretty printing
casting-expert -s '{"name": "John"}' --pretty --indent 4
```
2. **📂 File Operations**
```bash
# Read from file
casting-expert -f input.json
# Write to file
casting-expert -f input.json -o output.json
# Convert JSON to YAML
casting-expert -f input.json --format yaml -o output.yaml
```
3. **📊 Format Options**
```bash
# Output as YAML
casting-expert -s '{"name": "John"}' --format yaml
# Output as Python dict
casting-expert -s '{"name": "John"}' --format python
# Pretty JSON
casting-expert -s '{"name": "John"}' --pretty
```
4. **📥 Standard Input**
```bash
# Pipe input
echo '{"name": "John"}' | casting-expert -i
# Redirect input
casting-expert -i < input.json
```
### CLI Options Reference
```
📋 Required Options (choose one):
  -s, --string STRING   Input string to parse
  -f, --file FILE       Input file path
  -i, --stdin           Read from stdin

📝 Output Options:
  -o, --output OUTPUT   Output file path
  --format FORMAT       Output format (json|yaml|python)
  --indent INDENT       Indentation spaces (default: 2)
  --pretty              Enable pretty printing
  -q, --quiet           Suppress non-error output
```
## 📁 Package Structure
```
src/
├── casting_expert/            # Main package directory
│   ├── __init__.py            # Package initialization
│   ├── cli.py                 # CLI implementation
│   ├── core.py                # Core casting functions
│   ├── validators.py          # Input validation
│   └── casters/               # Specialized casters
│       ├── __init__.py
│       ├── parsers.py         # String parsing
│       ├── serializers.py     # Data serialization
│       ├── type_inference.py  # Type detection
│       └── validators.py      # Data validation
```
# 📚 Advanced Use Cases & Examples
## 🔄 Data Processing
### 1. API Response Processing
```python
from casting_expert import parse_string_to_dict, TypeInference
def process_api_response():
    # Sample API response
    response = '''
    {
        "status": "success",
        "code": "200",
        "data": {
            "user_id": "12345",
            "is_active": "true",
            "last_login": "2024-03-12T10:30:00Z",
            "metrics": {
                "visits": "1000",
                "conversion_rate": "0.15"
            }
        }
    }
    '''

    # Parse and infer types
    data = parse_string_to_dict(response)
    typed_data = TypeInference.infer_types_in_dict(data)

    # Access strongly-typed data
    user_id = typed_data['data']['user_id']                        # int: 12345
    is_active = typed_data['data']['is_active']                    # bool: True
    conversion = typed_data['data']['metrics']['conversion_rate']  # float: 0.15
    return user_id, is_active, conversion
```
### 2. Configuration Management
```python
from casting_expert import parse_yaml_like, DictValidator
# Define configuration schema
config_schema = {
    "database": DictValidator.create_field(
        dict,
        schema={
            "host": DictValidator.create_field(str, required=True),
            "port": DictValidator.create_field(int, min_value=1, max_value=65535),
            "credentials": DictValidator.create_field(
                dict,
                schema={
                    "username": DictValidator.create_field(str, required=True),
                    "password": DictValidator.create_field(str, required=True)
                }
            )
        }
    ),
    "cache": DictValidator.create_field(
        dict,
        schema={
            "enabled": DictValidator.create_field(bool, required=True),
            "ttl": DictValidator.create_field(int, min_value=0)
        }
    )
}
# Load and validate configuration
config_str = '''
database:
    host: localhost
    port: 5432
    credentials:
        username: admin
        password: secret123
cache:
    enabled: true
    ttl: 3600
'''
config = parse_yaml_like(config_str)
validation_result = DictValidator.validate(config, config_schema)
```
### 3. Data Analysis Pipeline
```python
import pandas as pd
from casting_expert import parse_string_to_dict, TypeInference
def analyze_data():
    # Sample data
    data_str = '''
    {
        "sales_data": [
            {"date": "2024-03-01", "revenue": "1000.50", "units": "50"},
            {"date": "2024-03-02", "revenue": "1500.75", "units": "75"},
            {"date": "2024-03-03", "revenue": "1250.25", "units": "60"}
        ],
        "metadata": {
            "currency": "USD",
            "store_id": "123"
        }
    }
    '''

    # Parse and process
    data = parse_string_to_dict(data_str)
    typed_data = TypeInference.infer_types_in_dict(data)

    # Convert to pandas DataFrame
    df = pd.DataFrame(typed_data['sales_data'])

    # Analysis
    total_revenue = df['revenue'].sum()
    avg_units = df['units'].mean()
    return df, total_revenue, avg_units
```
### 4. Log Processing
```python
from datetime import datetime

from casting_expert import parse_string_to_dict, DictSerializer, TypeInference
def process_logs():
    # Sample log entry
    log_entry = '''
    {
        "timestamp": "2024-03-12T10:30:00Z",
        "level": "ERROR",
        "service": "authentication",
        "message": "Login failed",
        "metadata": {
            "user_id": "12345",
            "ip": "192.168.1.1",
            "attempts": "3"
        }
    }
    '''

    # Parse and enhance
    log = parse_string_to_dict(log_entry)
    typed_log = TypeInference.infer_types_in_dict(log)

    # Transform for storage
    enhanced_log = {
        **typed_log,
        "processed_at": datetime.now().isoformat(),
        "severity": 5 if typed_log['level'] == 'ERROR' else 3
    }

    # Serialize for storage
    return DictSerializer.to_json(enhanced_log)
```
### 5. Form Data Processing
```python
from casting_expert import parse_query_string, DictValidator
def process_form():
    # Sample form data
    form_data = "name=John+Doe&age=30&email=john%40example.com&subscribe=true"

    # Parse query string
    data = parse_query_string(form_data)

    # Validate form data
    form_schema = {
        "name": DictValidator.create_field(str, required=True, min_length=2),
        "age": DictValidator.create_field(int, min_value=18),
        "email": DictValidator.create_field(
            str,
            pattern=r'^[\w\.-]+@[\w\.-]+\.\w+$'
        ),
        "subscribe": DictValidator.create_field(bool)
    }

    validation_result = DictValidator.validate(data, form_schema)
    return validation_result.is_valid, data
```
### 6. Data Migration
```python
from casting_expert import (
    parse_string_to_dict,
    DictSerializer,
    TypeInference
)
def migrate_data():
    # Old format
    old_data = '''
    {
        "user": {
            "firstName": "John",
            "lastName": "Doe",
            "isActive": "1",
            "loginCount": "42"
        }
    }
    '''

    # Parse and transform
    data = parse_string_to_dict(old_data)
    typed_data = TypeInference.infer_types_in_dict(data)

    # New format
    new_data = {
        "profile": {
            "full_name": f"{typed_data['user']['firstName']} {typed_data['user']['lastName']}",
            "active": bool(typed_data['user']['isActive']),
            "stats": {
                "logins": typed_data['user']['loginCount']
            }
        }
    }

    # Output in different formats
    return {
        "json": DictSerializer.to_json(new_data),
        "yaml": DictSerializer.to_yaml_like(new_data),
        "query": DictSerializer.to_query_string(new_data)
    }
```
# 🔧 Troubleshooting Guide
## Common Issues and Solutions
### 1. Parsing Errors
#### Issue: Invalid JSON Format
```python
ParsingError: Invalid dictionary format: Expecting property name enclosed in double quotes
```
**Solution**:
- Ensure all keys are enclosed in double quotes
- Check for missing or extra commas
- Validate JSON syntax using a JSON validator
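The standard library can pinpoint exactly where a JSON string goes wrong; `json.JSONDecodeError` carries the line and column of the failure:

```python
import json

try:
    json.loads('{name: "John"}')
except json.JSONDecodeError as e:
    print(f"{e.msg} at line {e.lineno}, column {e.colno}")
```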
**Example Fix**:
```python
# Invalid
data = parse_string_to_dict('{name: "John"}')
# Valid
data = parse_string_to_dict('{"name": "John"}')
```
### 2. Type Inference Issues
#### Issue: Unexpected Type Inference
```python
# Data contains number-like strings that should remain strings
data = {"id": "001", "code": "123456"}
```
**Solution**:
Use explicit type casting or custom validation:
```python
from casting_expert import DictValidator
schema = {
    "id": DictValidator.create_field(str),  # Force string type
    "code": DictValidator.create_field(str)
}
```
### 3. Validation Errors
#### Issue: Complex Validation Requirements
```python
ValidationError: Invalid value for field 'email'
```
**Solution**:
Use custom validators:
```python
def validate_email_domain(email: str) -> bool:
    return email.endswith(('@company.com', '@company.org'))

schema = {
    "email": DictValidator.create_field(
        str,
        pattern=r'^[\w\.-]+@[\w\.-]+\.\w+$'
    ).add_validator(
        validate_email_domain,
        "Email must be from company domain"
    )
}
```
### 4. CLI Issues
#### Issue: YAML Output Not Working
```bash
Warning: PyYAML not installed. Defaulting to JSON format.
```
**Solution**:
Install YAML support:
```bash
pip install "casting-expert[yaml]"
```
### 5. Performance Issues
#### Issue: Slow Processing of Large Files
**Solution**:
- Use streaming for large files
- Process data in chunks
- Use appropriate format options
```python
from casting_expert import parse_string_to_dict, ParsingError

def process_large_file(filepath: str):
    with open(filepath, 'r') as f:
        for line in f:
            try:
                data = parse_string_to_dict(line.strip())
                # Process each line
                yield data
            except ParsingError:
                continue
```
### 6. Module Import Issues
#### Issue: Module Not Found
**Solution**:
- Verify installation:
```bash
pip show casting-expert
```
- Check Python path
- Verify virtual environment activation
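You can also check from within Python whether the interpreter you are running can actually see the package (a quick way to catch virtual-environment mix-ups):

```python
import importlib.util

spec = importlib.util.find_spec("casting_expert")
if spec:
    print("casting_expert found at:", spec.origin)
else:
    print("casting_expert is not importable from this interpreter")
```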
### 7. Common Error Messages
#### `ParsingError: Invalid dictionary format`
- Check input string format
- Verify quotes and delimiters
- Ensure valid nesting
#### `ValidationError: Required field missing`
- Check schema requirements
- Verify all required fields are present
- Check field names case sensitivity
#### `TypeError: Object of type X is not JSON serializable`
- Use appropriate serialization method
- Convert custom objects to basic types
- Implement custom serializers if needed
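The serializer error above usually comes from values such as `datetime` that plain JSON cannot represent. One standard-library approach is a fallback encoder (shown with `json` directly; the same idea of converting to basic types applies before handing data to any serializer):

```python
import json
from datetime import datetime

def to_basic(obj):
    """Fallback for types json cannot serialize natively."""
    if isinstance(obj, datetime):
        return obj.isoformat()
    raise TypeError(f"Object of type {type(obj).__name__} is not JSON serializable")

data = {"event": "login", "at": datetime(2024, 3, 12, 10, 30)}
print(json.dumps(data, default=to_basic))
# {"event": "login", "at": "2024-03-12T10:30:00"}
```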
## 🤝 Contributing
Contributions are welcome! See [CONTRIBUTING.md](CONTRIBUTING.md) for guidelines.
## 📄 License
MIT License - See [LICENSE](LICENSE) file for details.
## 📬 Contact & Support
- 📃 Documentation: [GitHub](https://github.com/ahmednizami/casting-expert/)
- 🐛 Issues: [GitHub Issues](https://github.com/ahmednizami/casting-expert/issues)
- 💻 Source: [GitHub](https://github.com/ahmednizami/casting-expert)
- 📧 Email: ahmednizami2021@gmail.com