timesleuth


Name: timesleuth
Version: 1.0.0
Home page: https://github.com/MohammadNazmulDev/TimeSleuth
Summary: Digital forensics timestamp analysis library for detecting suspicious activity
Upload time: 2025-08-08 02:11:13
Author: Mohammad Nazmul
Maintainer: None
Docs URL: None
Requires Python: >=3.7
License: None
Keywords: forensics, digital-forensics, timestamp, security, incident-response, malware-analysis, cybersecurity
Requirements: No requirements were recorded.

# TimeSleuth

**Uncover digital footprints through timestamp forensics**

TimeSleuth is a lightweight Python library for digital forensics investigators, security researchers, and system administrators who need to analyze file timestamp patterns to detect suspicious activity, reconstruct incident timelines, and identify evidence tampering.

## Core Purpose

Detect timestamp anomalies that indicate:
- System compromise and malware activity
- Evidence tampering and anti-forensics attempts
- Unauthorized file access patterns
- Data exfiltration events

TimeSleuth also supports timeline reconstruction for incident response.

## Key Features

TimeSleuth provides comprehensive timestamp analysis through five main capabilities:

- **Anomaly Detection** - Identify impossible timestamp combinations and suspicious patterns
- **Timeline Reconstruction** - Build chronological activity maps from file metadata
- **Backdating Detection** - Find files with artificially altered timestamps
- **Batch Activity Analysis** - Detect automated/scripted file operations
- **Comprehensive Reporting** - Generate forensics-ready reports in JSON/CSV formats

## Installation and Basic Usage

Install TimeSleuth using pip:

```bash
pip install timesleuth
```
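
Then, as a quick check that the install works, run a minimal scan (a sketch using the calls documented in the usage guide below):

```python
import timesleuth as ts

# Scan a directory tree and print any flagged files; see the usage guide
# below for the full set of scanning options and result fields.
anomalies = ts.scan_directory("/evidence/", recursive=True)
for anomaly in anomalies:
    print(f"{anomaly['severity']}: {anomaly['path']} - {anomaly['reason']}")
```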

## Complete Usage Guide

### 1. Directory Scanning for Anomalies

The primary function for detecting timestamp anomalies across files in a directory:

```python
import timesleuth as ts

# Basic directory scan
anomalies = ts.scan_directory("/evidence/")

# Advanced scanning options
anomalies = ts.scan_directory(
    directory_path="/evidence/",
    recursive=True,              # Include subdirectories
    include_hidden=True,         # Include hidden files
    file_extensions=['.log', '.txt', '.exe']  # Filter by extensions
)

# Process results
for anomaly in anomalies:
    print(f"File: {anomaly['path']}")
    print(f"Issue: {anomaly['reason']}")
    print(f"Severity: {anomaly['severity']}")
    print(f"Details: {anomaly['details']}")
    print("---")
```

### 2. Timeline Creation and Reconstruction  

Build chronological timelines from file timestamp data:

```python
# Create basic chronological timeline
timeline = ts.create_timeline("/evidence/", output_format='list', sort_by='modified')

# Alternative timeline formats
access_timeline = ts.create_timeline("/evidence/", sort_by='accessed')
grouped_timeline = ts.create_timeline("/evidence/", output_format='grouped')

# Comprehensive activity reconstruction
activity_data = ts.reconstruct_activity("/evidence/", analysis_type='comprehensive')

# Process timeline data
for activity in timeline:
    print(f"{activity['timestamp']}: {activity['file_path']}")
    print(f"  Type: {activity['type']}, Size: {activity['file_size']} bytes")

# Access activity groups for burst analysis
if 'activity_groups' in activity_data:
    for group in activity_data['activity_groups']['groups']:
        print(f"Activity burst: {group['activity_count']} files in {group['duration_seconds']} seconds")
```

### 3. Backdated File Detection

Identify files with potentially manipulated timestamps:

```python
# Find files backdated beyond threshold
backdated = ts.find_backdated_files(
    directory_path="/evidence/",
    threshold_days=30,           # Flag files more than 30 days older than the reference
    reference_time=None          # Use current time as reference
)

# Use specific reference time
backdated = ts.find_backdated_files(
    "/evidence/",
    reference_time="2024-01-15T10:00:00Z",
    threshold_days=7
)

# Process backdated files
for file_info in backdated:
    print(f"Backdated file: {file_info['path']}")
    print(f"Age: {file_info['age_days']} days")
    print(f"Suspicious reasons: {', '.join(file_info['reasons'])}")
    print(f"Timestamps: {file_info['timestamps']}")
```

### 4. Advanced Pattern Analysis

Detect sophisticated timestamp manipulation patterns:

```python
from pathlib import Path

# Get list of regular files for analysis (skip directories)
files = [p for p in Path("/evidence/").rglob("*") if p.is_file()]

# Analyze timestamp patterns
patterns = ts.analyze_timestamp_patterns(
    file_list=files,
    detect_batch_ops=True,       # Find batch operations
    detect_time_clustering=True  # Find artificial clustering
)

# Process batch operations
for batch in patterns['batch_operations']:
    print(f"Batch operation detected:")
    print(f"  Files: {batch['file_count']}")
    print(f"  Duration: {batch['duration_seconds']} seconds")
    print(f"  Time range: {batch['start_time']} to {batch['end_time']}")

# Process time clusters
for cluster in patterns['time_clusters']:
    print(f"Suspicious clustering in {cluster['timestamp_type']} timestamps")
    print(f"  Entropy: {cluster['entropy']:.2f} (lower = more suspicious)")
    print(f"  Severity: {cluster['severity']}")
```

### 5. Report Generation

Create comprehensive forensic reports in multiple formats:

```python
# Combine all analysis results
investigation_data = {
    'anomalies': ts.scan_directory("/evidence/", recursive=True),
    'timeline': ts.create_timeline("/evidence/"),
    'backdated_files': ts.find_backdated_files("/evidence/"),
    'file_count': sum(1 for p in Path("/evidence/").rglob("*") if p.is_file())
}

# Generate JSON report
ts.generate_forensics_report(
    scan_results=investigation_data,
    output_path="forensic_report.json",
    report_format='json',
    include_metadata=True
)

# Generate CSV report for spreadsheet analysis
ts.generate_forensics_report(
    investigation_data,
    "forensic_report.csv", 
    report_format='csv'
)

# Create specialized timeline report
timeline_data = ts.reconstruct_activity("/evidence/", analysis_type='comprehensive')
ts.create_timeline_report(timeline_data, "timeline_report.json", format_type='json')

# Export raw data in different formats
ts.export_to_json(investigation_data, "raw_data.json", pretty_print=True)
if investigation_data['anomalies']:
    ts.export_to_csv(investigation_data['anomalies'], "anomalies.csv")
```

### 6. Incident Investigation Workflow

Complete forensic investigation process:

```python
def investigate_incident(evidence_path, output_dir="investigation"):
    """Complete forensic timestamp investigation"""
    
    # Phase 1: Initial anomaly scan
    print("Phase 1: Scanning for anomalies...")
    anomalies = ts.scan_directory(evidence_path, recursive=True, include_hidden=True)
    
    # Phase 2: Timeline reconstruction
    print("Phase 2: Reconstructing timeline...")
    timeline = ts.reconstruct_activity(evidence_path, analysis_type='comprehensive')
    
    # Phase 3: Backdated file detection
    print("Phase 3: Finding backdated files...")
    backdated = ts.find_backdated_files(evidence_path, threshold_days=90)
    
    # Phase 4: Pattern analysis
    print("Phase 4: Analyzing patterns...")
    files = [p for p in Path(evidence_path).rglob("*") if p.is_file()]  # regular files only
    patterns = ts.analyze_timestamp_patterns(files)
    
    # Phase 5: Generate comprehensive report
    print("Phase 5: Generating reports...")
    complete_findings = {
        'anomalies': anomalies,
        'timeline': timeline,
        'backdated_files': backdated,
        'patterns': patterns,
        'file_count': len(files)
    }
    
    # Create output directory
    Path(output_dir).mkdir(exist_ok=True)
    
    # Generate reports
    ts.generate_forensics_report(complete_findings, f"{output_dir}/investigation_report.json")
    ts.create_timeline_report(timeline, f"{output_dir}/timeline.json")
    
    if anomalies:
        ts.export_to_csv(anomalies, f"{output_dir}/anomalies.csv")
    
    print(f"Investigation complete. Reports in {output_dir}/")
    return complete_findings

# Run investigation
results = investigate_incident("/path/to/evidence")
```

## Detection Capabilities

TimeSleuth detects various types of timestamp anomalies and suspicious patterns:

**Timestamp Anomalies**
- Future timestamps that exceed current system time
- Pre-1980 timestamps indicating file corruption or manipulation
- Identical timestamp sets across multiple files (mass tampering)
- Creation time occurring after modification time (impossible sequence; see the sketch after these lists)
- Round timestamp values ending in :00 seconds (manual tampering indicator)
- Access time occurring before modification time

**Activity Patterns**  
- Batch file operations indicating automated scripts or malware activity
- Time clustering showing artificial grouping of file operations
- Off-hours activity bursts suggesting unauthorized access
- High-intensity activity periods with unusual file operation rates
- Suspicious file type patterns in timestamp modifications

**Evidence Tampering Detection**
- Files with suspiciously old timestamps relative to content
- Timestamp relationship violations between different timestamp types
- Low entropy timestamp distributions indicating artificial patterns
- Files sharing identical timestamp signatures across multiple attributes
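
As a concrete illustration of one of the checks listed above, the standalone sketch below flags the "creation time after modification time" case using nothing but `os.stat`. It is a conceptual example rather than TimeSleuth's internal implementation; note that a true creation time is only exposed on some platforms (`st_birthtime` on macOS/BSD, while on Linux `st_ctime` is the inode-change time):

```python
import os

def creation_after_modification(path):
    """Return True if the file's recorded creation time is later than its
    modification time -- an impossible sequence for an untampered file.
    Conceptual check only, not TimeSleuth's implementation."""
    st = os.stat(path)
    created = getattr(st, "st_birthtime", st.st_ctime)  # fall back to ctime
    return created > st.st_mtime
```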

## API Reference

### Primary Functions

**scan_directory(directory_path, recursive=True, include_hidden=False, file_extensions=None)**

Scan directory for timestamp anomalies.

Parameters:
- `directory_path` (str): Path to directory to scan
- `recursive` (bool): Include subdirectories in scan
- `include_hidden` (bool): Include hidden files in analysis  
- `file_extensions` (list): Filter files by extensions (e.g., ['.log', '.exe'])

Returns: List of anomaly dictionaries with path, reason, severity, timestamps, and details
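
For orientation, a single entry in the returned list has roughly the shape below; the top-level field names come from the description above, while the concrete values and the nested layout of `timestamps` are illustrative assumptions:

```python
# Illustrative shape of one scan_directory() result entry; values are
# made up, and only the top-level keys are taken from the documentation.
anomaly = {
    'path': '/evidence/logs/auth.log',
    'reason': 'creation time occurs after modification time',
    'severity': 'high',
    'timestamps': {},   # the file's raw created/modified/accessed times
    'details': 'supporting detail for the finding',
}
```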

**create_timeline(directory_path, output_format='list', sort_by='modified')**

Generate chronological timeline from file timestamps.

Parameters:
- `directory_path` (str): Directory to analyze
- `output_format` (str): Output format ('list' for chronological, 'grouped' for time windows)
- `sort_by` (str): Timestamp type for sorting ('modified', 'accessed', 'created', 'all')

Returns: Timeline data structure with chronologically ordered file activities

**find_backdated_files(directory_path, reference_time=None, threshold_days=30)**

Identify files with potentially backdated timestamps.

Parameters:
- `directory_path` (str): Directory to scan for backdated files
- `reference_time` (str): ISO format reference timestamp (default: current time)
- `threshold_days` (int): Age threshold in days for flagging files as suspicious

Returns: List of backdated file information with paths, ages, and suspicious reasons

### Advanced Functions

**analyze_timestamp_patterns(file_list, detect_batch_ops=True, detect_time_clustering=True)**

Analyze timestamp patterns across multiple files for sophisticated detection.

Parameters:
- `file_list` (list): List of file paths to analyze
- `detect_batch_ops` (bool): Enable batch operation detection
- `detect_time_clustering` (bool): Enable time clustering analysis

Returns: Dictionary with batch operations, time clusters, and suspicious patterns

**reconstruct_activity(directory_path, analysis_type='comprehensive')**

Reconstruct complete file system activity from timestamps.

Parameters:
- `directory_path` (str): Path to directory for activity reconstruction
- `analysis_type` (str): Analysis depth ('basic', 'comprehensive', 'suspicious_only')

Returns: Comprehensive activity analysis with timelines, patterns, and statistics

### Reporting Functions

**generate_forensics_report(scan_results, output_path, report_format='json', include_metadata=True)**

Generate comprehensive forensic investigation report.

**export_to_json(data, output_path, pretty_print=True)**

Export analysis data to JSON format with timestamp serialization.

**export_to_csv(data, output_path, fieldnames=None)**

Export analysis results to CSV format for spreadsheet analysis.

**create_timeline_report(timeline_data, output_path, format_type='json')**

Generate specialized timeline reports for temporal analysis.
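
The only reporting option not exercised in the usage guide above is `export_to_csv`'s `fieldnames` parameter; a hedged sketch of how it might be used to restrict and order columns, assuming it behaves like the `fieldnames` argument of `csv.DictWriter`:

```python
# Write only selected columns, in a fixed order; the column names are the
# documented anomaly keys, and the csv.DictWriter-like behaviour of
# `fieldnames` is an assumption.
ts.export_to_csv(anomalies, "anomalies_summary.csv",
                 fieldnames=['path', 'severity', 'reason'])
```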

### Utility Functions

**get_file_timestamps(file_path)**

Extract all available timestamps from a file (creation, modification, access times).

**format_timestamp(dt, format_str='%Y-%m-%d %H:%M:%S UTC')**

Format datetime objects for consistent display across reports.
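
A short sketch combining the two utilities; the path is illustrative, and since the return structure of `get_file_timestamps` is not spelled out here, the example simply prints it:

```python
from datetime import datetime, timezone

import timesleuth as ts

# Pull the raw creation/modification/access times of one file of interest.
stamps = ts.get_file_timestamps("/evidence/logs/auth.log")
print(stamps)

# Render an arbitrary datetime the same way the built-in reports do.
print(ts.format_timestamp(datetime.now(timezone.utc)))
```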

## Technical Architecture

**Pure Python Implementation**
- No external dependencies beyond Python standard library
- Cross-platform compatibility (Linux, Windows, macOS)
- Optimized for large directory structures with incremental processing
- Memory efficient file handling for forensic investigations

**Performance Characteristics**
- Processes thousands of files efficiently using generator patterns
- Minimal memory footprint through streaming analysis
- Concurrent timestamp extraction where possible
- Structured data outputs for integration with forensic tools
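
The streaming approach described above can be pictured as a generator that yields one file's metadata at a time rather than materialising the whole tree; the sketch below illustrates the pattern in general terms and is not TimeSleuth's actual internal code:

```python
import os

def iter_file_stats(root):
    """Yield (path, stat_result) pairs one file at a time so memory use
    stays flat on very large evidence trees (conceptual illustration)."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                yield path, os.stat(path, follow_symlinks=False)
            except OSError:
                # Unreadable or vanished file: skip it and keep streaming.
                continue
```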

**Standards Compliance**
- ISO 8601 timestamp formatting throughout
- Structured JSON output compatible with forensic analysis tools
- CSV exports compatible with spreadsheet applications
- UTF-8 encoding for international file path support

## Project Structure

The TimeSleuth library is organized into focused modules:

```
timesleuth/
├── __init__.py              # Main API exports and version info
├── core.py                  # Primary scanning and detection functions
├── anomaly_detection.py     # Anomaly detection algorithms and pattern analysis
├── timeline.py              # Timeline reconstruction and activity analysis
├── utils.py                 # Timestamp utilities and validation functions
├── reporting.py             # Report generation and data export functions
└── examples/
    ├── basic_usage.py           # Core functionality demonstration
    └── forensics_investigation.py # Complete investigation workflow
```

## Educational Applications

TimeSleuth serves as an excellent educational tool for:

**Digital Forensics Training**
- Understanding file system timestamp behavior
- Learning evidence tampering detection techniques
- Practicing timeline reconstruction methods
- Developing forensic analysis workflows

**Cybersecurity Education**
- Incident response skill development
- Malware analysis timestamp patterns
- Anti-forensics technique recognition
- Automated threat detection principles

**System Administration**
- File integrity monitoring concepts
- System activity analysis methods
- Security audit trail examination
- Compliance monitoring implementations

## Contributing

Contributions are welcome through GitHub pull requests. Areas for improvement include:
- Additional timestamp anomaly detection algorithms
- Performance optimizations for very large datasets
- New output format support (XML, database integration)
- Enhanced pattern recognition capabilities
- Expanded cross-platform timestamp handling

## License

MIT License - See LICENSE file for complete terms.

TimeSleuth provides professional-grade timestamp forensics capabilities in a simple, accessible Python library designed for forensic investigators, security researchers, and system administrators.

            
