pds-web-analytics

Name: pds-web-analytics
Version: 1.0.1
Home page: https://github.com/NASA-PDS/web-analytics
Summary: PDS Web Analytics - Log processing and S3 synchronization for Planetary Data System
Upload time: 2025-10-08 23:53:21
Author: PDS
Requires Python: >=3.12
License: apache-2.0
Keywords: pds, planetary data, web analytics, logstash, s3, aws
# PDS Web Analytics

A comprehensive web analytics system for the Planetary Data System (PDS) that processes and analyzes web access logs from multiple PDS nodes using Logstash, OpenSearch, and AWS services.

## Overview

This system ingests web access logs from various PDS nodes (ATM, EN, GEO, IMG, NAIF, PPI, RINGS, SBN) and processes them through a Logstash pipeline to extract meaningful analytics data. The processed data is stored in OpenSearch for visualization and analysis.

### Key Features

- **Multi-format Log Processing**: Supports Apache Combined, IIS, FTP, and Tomcat log formats
- **ECS v8 Compliance**: All data is structured according to Elastic Common Schema v8
- **Comprehensive Error Handling**: Bad logs are tagged and stored separately for analysis
- **Geographic IP Resolution**: Automatic geolocation and reverse DNS lookup
- **User Agent Analysis**: Bot detection and user agent parsing
- **Test Framework**: Automated testing with sample log data
- **AWS Integration**: S3 log ingestion and OpenSearch output
- **Environment Variable Support**: Configuration via environment variables with envsubst
- **Flexible AWS Profile**: Support for AWS_PROFILE environment variable
- **Native boto3 S3 Uploads**: S3 log sync now uses boto3 (no AWS CLI required for S3 uploads)

## Architecture

```
PDS Nodes → S3 Bucket → Logstash Pipeline → OpenSearch → Dashboards
                ↓
            Error Logs → Bad Logs File
```

See internal wiki for more detailed architecture.

**NOTE:** Current practice is for PDS EN to gather the various PDS nodes' logs onto the PDS reporting servers and then sync them to S3. In FY2026 this will shift to each PDS node pushing its own logs, gzipped and in one of the accepted formats, directly to S3. This does not affect the architecture diagram above, but it will affect many of the instructions in this README.

## Prerequisites

### System Requirements
- **Operating System**: Linux/Unix (tested on CentOS 7.9, macOS)
- **Python**: 3.12.x or higher
- **Java**: OpenJDK 11 or higher (required for Logstash)
- **Memory**: Minimum 4GB RAM (8GB+ recommended for production)
- **Storage**: 10GB+ available disk space

### AWS Infrastructure Setup

See internal wiki for more details.

### Required Software

#### 1. Python Virtual Environment
```bash
# Create a virtual environment
python3 -m venv venv

# Activate the virtual environment
# On Linux/macOS:
source venv/bin/activate
# On Windows:
venv\Scripts\activate
```

#### 2. AWS Credentials (boto3)
- The S3 sync tool now uses [boto3](https://boto3.amazonaws.com/v1/documentation/api/latest/index.html) for all S3 operations.
- You do **not** need the AWS CLI for S3 uploads, but you must have valid AWS credentials (via `~/.aws/credentials`, environment variables, or IAM role).
- The `--aws-profile` argument or `AWS_PROFILE` environment variable can be used to select a profile (see the quick check below).
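
A quick way to confirm that boto3 can resolve credentials before running any sync is a one-line STS call; the profile name below is only an example:

```bash
# Confirm boto3 picks up credentials for the selected profile (example profile name)
export AWS_PROFILE=pds-analytics
python -c "import boto3; print(boto3.Session().client('sts').get_caller_identity()['Arn'])"
```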

#### 3. Logstash
```bash
# Download Logstash 8.x
wget https://artifacts.elastic.co/downloads/logstash/logstash-8.17.0-linux-x86_64.tar.gz
tar -xzf logstash-8.17.0-linux-x86_64.tar.gz
ln -s $(pwd)/logstash-8.17.0 $(pwd)/logstash

# Add to PATH
echo 'export PATH="$(pwd)/logstash/bin:$PATH"' >> ~/.bashrc
source ~/.bashrc

# Verify installation
logstash --version
```

Additional Logstash plugins also need to be installed:
```bash
# Install the tld filter and opensearch output plugins

logstash-plugin install logstash-filter-tld
logstash-plugin install logstash-output-opensearch
```
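
To confirm the plugins were installed, they can be listed afterwards (exact output formatting may vary between Logstash versions):

```bash
# Verify both plugins appear in the installed plugin list
logstash-plugin list | grep -E 'logstash-filter-tld|logstash-output-opensearch'
```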

#### 4. envsubst (for environment variable substitution)
```bash
# Check whether envsubst is already installed
envsubst --help

# If not, install it

# On Ubuntu/Debian:
sudo apt-get install gettext-base

# On CentOS/RHEL:
sudo yum install gettext

# On macOS:
brew install gettext
```

## Installation

### 1. Clone the Repository
```bash
git clone https://github.com/NASA-PDS/web-analytics.git
cd web-analytics

# Create WEB_ANALYTICS_HOME environment variable
echo 'export WEB_ANALYTICS_HOME="$(pwd)"' >> ~/.bashrc
source ~/.bashrc
```

### 2. Set Up Python Environment
```bash
# Create and activate virtual environment (using Python 3.12 or 3.13)
python3 -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install the package in development mode (dependencies will be installed automatically)
pip install -e .
```

**Note**: A legacy `environment.yml` file is provided for users who prefer conda, but the recommended approach is to use Python virtual environments with the package's setup.cfg configuration.

### 3. Configure Environment Variables
Create a `.env` file in the repository root:
```bash
# AWS Configuration
export AWS_REGION=us-west-2
export S3_BUCKET_NAME=your-pds-logs-bucket
export AOSS_URL=https://your-opensearch-domain.us-west-2.es.amazonaws.com
export INDEX_PREFIX=pds-web-analytics

# Logstash Configuration
export LS_SETTINGS_DIR=$(pwd)/config/logstash/config
```
*See internal wiki for details of how to populate this file*

### 4. Set Up Logstash Configuration
```bash
cd $WEB_ANALYTICS_HOME

# Source your config
source .env

# Run the configuration build script
./scripts/logstash_build_config.sh
```

This script will:
- Copy the pipelines template, substituting the environment variables, to produce `pipelines.yml` (a conceptual sketch of this step follows the list)
- Create individual pipeline configuration files for each PDS node
- Combine input, filter, and output configurations automatically
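
The authoritative steps live in `scripts/logstash_build_config.sh`; conceptually, the template substitution is roughly equivalent to the following sketch (paths assumed from the repository layout shown later in this README):

```bash
cd $WEB_ANALYTICS_HOME/config/logstash/config

# Substitute ${...} variables from the current environment into the pipelines definition
envsubst < pipelines.yml.template > pipelines.yml
```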

### 5. Set Up OpenSearch

1. Log into AWS and navigate to the OpenSearch Dashboard → Dev Tools
2. Check if template already exists (ecs-web-template):
```
GET _cat/templates
```
3. If not, create the template:
```
PUT _index_template/ecs-web-template

# copy-paste from https://github.com/NASA-PDS/web-analytics/tree/main/config/opensearch/ecs-8.17-custom-template.json
```
4. Verify success
```
GET _cat/templates
```

## Package Structure

The PDS Web Analytics system is organized as a Python package:

```
src/pds/web_analytics/
├── __init__.py          # Package initialization
├── s3_sync.py          # S3Sync class implementation (now uses boto3)
└── VERSION.txt         # Package version
```

### Installing the Package

After setting up the environment, install the package in development mode:

```bash
cd $WEB_ANALYTICS_HOME

# Install in development mode
pip install -e .

# Verify installation
s3-log-sync --help
```

This makes the `s3-log-sync` command available on your PATH whenever the virtual environment is active.

## Configuration

### Logstash Configuration Structure

```
config/logstash/config/
├── inputs/                    # S3 input configurations for each PDS node
│   ├── pds-input-s3-atm.conf
│   ├── pds-input-s3-en.conf
│   ├── pds-input-s3-geo.conf
│   ├── pds-input-s3-img.conf
│   ├── pds-input-s3-naif.conf
│   ├── pds-input-s3-ppi.conf
│   ├── pds-input-s3-rings.conf
│   └── pds-input-s3-sbn.conf
├── shared/                    # Shared filter and output configurations
│   ├── pds-filter.conf       # Main processing pipeline
│   └── pds-output-opensearch.conf
├── plugins/                   # Custom plugins and patterns
│   └── regexes.yaml
├── logstash.yml              # Logstash main configuration
└── pipelines.yml.template    # Pipeline definitions
```

### S3 Log Sync Configuration

Create a configuration file based on `config/config_example.yaml`:

```yaml
s3_bucket: ${S3_BUCKET}
s3_subdir: logs
subdirs:
  data:
    logs:
      include:
        - "*"
```

The configuration supports environment variable substitution using `${VARIABLE_NAME}` syntax, which is processed by `envsubst` (still required).
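
One possible way to render the example into a concrete configuration (an illustration of the workflow, not a required step; the variable name comes from the example above):

```bash
cd $WEB_ANALYTICS_HOME

# S3_BUCKET must be set to your bucket name before substitution
export S3_BUCKET=your-pds-logs-bucket
envsubst < config/config_example.yaml > config/config.yaml
```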

## Usage

### 1. S3 Log Synchronization

**NOTE:** This step is NOT required if the log files are already in S3.

Sync logs from PDS reporting servers to S3:

```bash
cd $WEB_ANALYTICS_HOME

# Using the package command (recommended)
s3-log-sync -c config/config.yaml -d /var/log/pds

# If AWS_PROFILE environment variable is set, it will be used automatically
export AWS_PROFILE=pds-analytics
s3-log-sync -c config/config.yaml -d /var/log/pds

# Or explicitly specify the AWS profile
s3-log-sync -c config/config.yaml -d /var/log/pds --aws-profile pds-analytics

# Disable gzip compression
s3-log-sync -c config/config.yaml -d /var/log/pds --no-gzip

# Set up as a cron job (example: every hour)
0 * * * * cd /path/to/web-analytics && s3-log-sync -c config/config.yaml -d /var/log/pds
```

**Note**: The `--aws-profile` argument defaults to the `AWS_PROFILE` environment variable if it's set. If neither is provided, the command will fail with a helpful error message. All S3 uploads are performed using boto3 (not the AWS CLI).

### 2. Logstash Processing

Start Logstash with the PDS configuration:

```bash
cd $WEB_ANALYTICS_HOME

# Source the environment variables
source .env

# Pull the latest changes on the repo
git pull

# If anything changed, re-generate the pipeline configs
./scripts/logstash_build_config.sh

# Start Logstash
logstash -f ${WEB_ANALYTICS_HOME}/config/logstash/config/pipelines.yml

# To run in the background
nohup $HOME/logstash/bin/logstash > $OUTPUT_LOG 2>&1 &
```
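
Before starting a long-running instance, it can help to validate the generated configuration first. Logstash's `-t` (`--config.test_and_exit`) flag checks the configuration and exits without processing any events; the sketch below simply mirrors the start command above with that flag added:

```bash
# Dry-run configuration check (validates and exits, no events are processed)
logstash -t -f ${WEB_ANALYTICS_HOME}/config/logstash/config/pipelines.yml
```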

### 3. Testing

Run the comprehensive test suite:

```bash
# Run unit tests
python -m pytest tests/test_s3_sync.py -v

# Run integration tests
python -m unittest tests.test_logstash_integration

# Or use the test runner script
chmod +x tests/run_tests.sh
./tests/run_tests.sh
```

The test suite validates:
- Log parsing accuracy
- Error handling
- Bad log detection
- ECS field mapping
- Output formatting
- Configuration loading with environment variables
- AWS profile handling
- **boto3 S3 upload logic**

### 4. Monitoring

Check Logstash status and logs:

```bash
# Check Logstash process
ps aux | grep logstash

# Monitor nohup logs
source $WEB_ANALYTICS_HOME/.env
tail -f $OUTPUT_LOG

# Monitor logstash logs
tail -f $LOGSTASH_HOME/logs/logstash-plain.log

# Monitor bad logs
tail -f /tmp/bad_logs_$(date +%Y-%m).txt
```

## Data Processing Overview

### Supported Log Formats

1. **Apache Combined Log Format**
   ```
   192.168.1.1 - - [25/Dec/2023:10:30:45 +0000] "GET /data/file.txt HTTP/1.1" 200 1024 "http://referrer.com" "Mozilla/5.0..."
   ```

2. **Microsoft IIS Log Format**
   ```
   2023-12-25 10:30:45 W3SVC1 192.168.1.1 GET /data/file.txt 80 - 192.168.1.100 Mozilla/5.0... 200 0 0 1024 0 15
   ```

3. **FTP Transfer Logs**
   ```
   Mon Dec 25 10:30:45 2023 1 192.168.1.1 1024 /data/file.txt a _ o r user ftp 0 * c
   ```

4. **Tomcat Access Logs**
   ```
   192.168.1.1 - - [25/Dec/2023:10:30:45 +0000] "GET /webapp/data HTTP/1.1" 200 1024
   ```

### ECS Field Mapping

The system maps log data to Elastic Common Schema v8 fields (among others):

- `[source][address]` - Client IP address
- `[url][path]` - Requested URL path
- `[http][request][method]` - HTTP method (GET, POST, etc.)
- `[http][response][status_code]` - HTTP status code
- `[http][response][body][bytes]` - Response size in bytes
- `[user_agent][original]` - User agent string
- `[event][start]` - Request timestamp
- `[organization][name]` - PDS node identifier

### Error Handling

The system handles various error conditions:

- **Bad Unicode**: Logs with invalid characters are tagged with `bad_log`
- **Parse Failures**: Unparseable logs are tagged with `_grok_parse_failure`
- **Invalid HTTP Methods**: Non-standard methods are tagged with `_invalid_http_method`
- **Missing Fields**: Logs missing required fields are tagged appropriately

All error logs are stored in `/tmp/bad_logs_YYYY-MM.txt` with detailed error information.
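
For a quick look at how much is landing in the current month's bad-logs file, simple shell one-liners are enough (this assumes the tag text appears verbatim in each entry, which may not hold for every error type):

```bash
# Total bad log entries recorded so far this month
wc -l /tmp/bad_logs_$(date +%Y-%m).txt

# Entries flagged as grok parse failures
grep -c '_grok_parse_failure' /tmp/bad_logs_$(date +%Y-%m).txt
```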

## PDS Node Support

The system processes logs from the following PDS nodes:

| Node | Domain | Protocol | Dataset |
|------|--------|----------|---------|
| ATM | pds-atmospheres.nmsu.edu | HTTP/FTP | atm.http, atm.ftp |
| EN | pds.nasa.gov | HTTP | en.http |
| GEO | Multiple domains | HTTP/FTP | geo.http, geo.ftp |
| IMG | pds-imaging.jpl.nasa.gov | HTTP | img.http |
| NAIF | naif.jpl.nasa.gov | HTTP/FTP | naif.http, naif.ftp |
| PPI | pds-ppi.igpp.ucla.edu | HTTP | ppi.http |
| RINGS | pds-rings.seti.org | HTTP | rings.http |
| SBN | Multiple domains | HTTP | sbn.http |

## Development

### Project Structure

```
web-analytics/
├── config/                    # Configuration files
│   ├── logstash/             # Logstash configurations
│   └── config_example.yaml   # S3 sync configuration template
├── scripts/                   # Utility scripts
│   ├── s3_log_sync.py        # S3 log synchronization
│   └── img_s3_download.py    # Image data download
├── tests/                     # Test framework
│   ├── data/logs/            # Sample log files
│   ├── config/               # Test configurations
│   └── run_tests.sh          # Test runner
├── docs/                      # Documentation
├── terraform/                 # Infrastructure as Code
└── src/                       # Source code
```

### Installation

Install in editable mode and with extra developer dependencies into your virtual environment of choice:

    pip install --editable '.[dev]'

See [the wiki entry on Secrets](https://github.com/NASA-PDS/nasa-pds.github.io/wiki/Git-and-Github-Guide#detect-secrets) to install and setup detect-secrets.

Then, configure the `pre-commit` hooks:

    pre-commit install
    pre-commit install -t pre-push
    pre-commit install -t prepare-commit-msg
    pre-commit install -t commit-msg

These hooks will then check future commits for anything that might contain secrets. They also check code formatting, PEP8 compliance, type hints, etc.

👉 **Note:** A one-time setup is required both to support `detect-secrets` and in your global Git configuration. See [the wiki entry on Secrets](https://github.com/NASA-PDS/nasa-pds.github.io/wiki/Git-and-Github-Guide#detect-secrets) to learn how.

### Adding New PDS Nodes

1. Create a new input configuration in `config/logstash/config/inputs/` (see the sketch after this list)
2. Add the node to `config/logstash/config/pipelines.yml.template`
3. Update the S3 sync configuration
4. Add test cases to the test framework
5. Update this README with node information
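
As a rough sketch of step 1, using a hypothetical node named `new` (the copied file and the `pipelines.yml.template` entry must then be edited by hand):

```bash
cd $WEB_ANALYTICS_HOME

# Start the new input configuration from an existing node's file (hypothetical node "new")
cp config/logstash/config/inputs/pds-input-s3-atm.conf \
   config/logstash/config/inputs/pds-input-s3-new.conf

# After editing the copy and the pipelines template, regenerate the pipeline configs
./scripts/logstash_build_config.sh
```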

### Contributing

1. Fork the repository
2. Create a feature branch
3. Make your changes
4. Add tests for new functionality
5. Run the test suite
6. Submit a pull request

## Troubleshooting

### Common Issues

1. **Logstash won't start**
   - Check Java installation: `java -version`
   - Verify configuration syntax: `logstash -t -f config_file.conf`
   - Check file permissions

2. **No data in OpenSearch**
   - Verify AWS credentials and permissions
   - Check S3 bucket access
   - Review Logstash logs for errors

3. **High memory usage**
   - Adjust `pipeline.batch.size` in `logstash.yml`
   - Reduce `pipeline.workers` if needed
   - Monitor system resources

4. **Parse failures**
   - Check log format matches expected patterns
   - Review bad logs file for specific issues
   - Update grok patterns if needed

### Log Locations

- **Logstash logs**: `/var/log/logstash/`
- **Bad logs**: `/tmp/bad_logs_YYYY-MM.txt`
- **Test output**: `target/test/`

### Performance Tuning

For production deployments (an illustrative settings sketch follows this list):

1. **Instance sizing**: Use t3.xlarge or larger for high-volume processing
2. **Batch processing**: Adjust `pipeline.batch.size` based on memory availability
3. **Queue settings**: Configure `queue.max_bytes` and `queue.max_events`
4. **Monitoring**: Set up CloudWatch metrics for Logstash performance
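
The heredoc below just prints a representative set of `logstash.yml` values to merge in by hand; the numbers are illustrative only (not project defaults) and should be tuned to instance size and log volume:

```bash
# Print example settings to merge into config/logstash/config/logstash.yml by hand
cat <<'EOF'
pipeline.workers: 4
pipeline.batch.size: 250
queue.type: persisted
queue.max_bytes: 2gb
queue.max_events: 0
EOF
```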

## License

This project is licensed under the Apache License 2.0 - see the [LICENSE.md](LICENSE.md) file for details.

## Support

For questions and support:
- Check the [PDS Web Analytics PDF](PDS%20Web%20Analytics%20with%20Logstash%20_97cf55c410a64bbc903a13347b02ea71-260625-0752-1596.pdf) for detailed technical information
- Review the test framework for usage examples
- Contact the PDS development team

## Changelog

See [CHANGELOG.md](CHANGELOG.md) for a detailed history of changes and improvements.

            
