Metadata-Version: 2.4
Name: scope-forensics
Version: 1.0.0
Summary: Scope is an Open Source Cloud Forensics tool for AWS. Scope can rapidly obtain logs, discover resources, and create super timelines for analysis.
Author-email: Scope <scopeforensics@protonmail.com>
Project-URL: Homepage, https://github.com/scope-forensics/scope
Project-URL: Bug Tracker, https://github.com/scope-forensics/scope/issues
Classifier: Programming Language :: Python :: 3
Classifier: Operating System :: OS Independent
Requires-Python: >=3.6
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: boto3>=1.24.0
Requires-Dist: botocore>=1.27.0
Dynamic: license-file

# Scope - Cloud Forensics Tool

Scope is an Open Source Cloud Forensics tool for AWS. Scope can rapidly obtain logs, discover resources, and create super timelines for analysis.

## Features

- **AWS CloudTrail Collection**: Retrieve logs from S3 buckets or via the Management Events API
- **Normalized Timeline**: Convert cloud logs into a standardized timeline format
- **Multiple Export Formats**: Export timelines as CSV or JSON
- **Resource Discovery**: Identify available CloudTrail trails and AWS resources in your account
- **Credential Reports**: Generate and analyze IAM credential reports for security assessment

## Installation

### Using pip (Recommended)

```bash
pip install scope-forensics
```

### From Source

```bash
# Clone the repository
git clone https://github.com/scope-forensics/scope.git
cd scope

# Install the package
pip install .

# For development (editable mode)
pip install -e .
```

## Usage

### Basic Commands

```bash
# Display help information
scope --help

# List available commands
scope aws --help
```

### AWS Authentication

Scope supports multiple authentication methods:

1. **Interactive configuration**:
   ```bash
   # Configure AWS credentials interactively
   scope aws configure
   
   # Configure for a specific profile
   scope aws configure --profile my-profile
   ```

2. **Command-line arguments**:
   ```bash
   scope aws --access-key YOUR_ACCESS_KEY --secret-key YOUR_SECRET_KEY --region us-east-1 discover
   ```

3. **Environment variables**:
   ```bash
   # Windows
   set AWS_ACCESS_KEY_ID=your_access_key
   set AWS_SECRET_ACCESS_KEY=your_secret_key
   set AWS_DEFAULT_REGION=us-east-1
   
   # macOS/Linux
   export AWS_ACCESS_KEY_ID=your_access_key
   export AWS_SECRET_ACCESS_KEY=your_secret_key
   export AWS_DEFAULT_REGION=us-east-1
   ```

4. **AWS credentials file** (`~/.aws/credentials`)
5. **IAM role** (if running on an EC2 instance with an IAM role)

### Setting Up AWS Permissions

To use Scope effectively, you'll need an AWS user with appropriate permissions. Here's how to create one:

1. **Sign in to the AWS Management Console** and open the IAM console.

2. **Create a new policy**:
   - Go to "Policies" and click "Create policy"
   - Use the JSON editor and paste the following policy:
   ```json
   {
       "Version": "2012-10-17",
       "Statement": [
           {
               "Effect": "Allow",
               "Action": [
                   "cloudtrail:LookupEvents",
                   "cloudtrail:DescribeTrails",
                   "s3:GetObject",
                   "s3:ListBucket",
                   "s3:GetBucketLocation",
                   "ec2:DescribeInstances",
                   "iam:ListUsers",
                   "iam:ListRoles",
                   "iam:GenerateCredentialReport",
                   "iam:GetCredentialReport",
                   "lambda:ListFunctions",
                   "rds:DescribeDBInstances"
               ],
               "Resource": "*"
           }
       ]
   }
   ```
   - Name the policy "ScopeForensicsPolicy" and create it

3. **Create a new user**:
   - Go to "Users" and click "Add users"
   - Enter a username (e.g., "scope-forensics")
   - Select "Access key - Programmatic access"
   - Click "Next: Permissions"
   - Select "Attach existing policies directly"
   - Search for and select the "ScopeForensicsPolicy" you created
   - Complete the user creation process

4. **Save the credentials**:
   - Download or copy the Access Key ID and Secret Access Key
   - Use these credentials with the `scope aws configure` command

> **Note**: Consider using more restrictive permissions by limiting the "Resource" section to specific S3 buckets and CloudTrail trails.

### Discover CloudTrail Trails

To list all available CloudTrail trails in your AWS account:

```bash
scope aws discover
```

This command will display information about each trail, including its name, S3 bucket location, and whether it logs management events.

### Discover AWS Resources

To discover various AWS resources in your account (EC2, S3, IAM, Lambda, RDS):

```bash
# Discover all supported resource types
scope aws discover-resources

# Discover specific resource types
scope aws discover-resources --resource-types ec2 s3 --format json --output-file resources.json

# Limit discovery to specific regions
scope aws discover-resources --resource-types lambda rds --regions us-east-1 us-west-2
```

Available parameters:
- `--resource-types`: Types of resources to discover (choices: ec2, s3, iam_users, iam_roles, lambda, rds, all)
- `--regions`: Specific AWS regions to search (space-separated)
- `--output-file`: Path to save the output
- `--format`: Output format (choices: json, csv, terminal)

### Explore S3 Bucket Structure

To explore the structure of an S3 bucket and automatically detect CloudTrail logs:

```bash
scope aws explore-bucket --bucket your-cloudtrail-bucket
```

This command will:
1. List top-level prefixes in the bucket
2. Automatically detect potential CloudTrail log paths
3. Provide a ready-to-use command for collecting logs from the detected paths
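
Detection of this kind leans on CloudTrail's standard key layout, `AWSLogs/<account-id>/CloudTrail/<region>/<yyyy>/<mm>/<dd>/`. A stdlib-only sketch of how such detection could work (the helper and grouping are illustrative, not Scope's actual implementation):

```python
import re

# Standard CloudTrail key layout:
# AWSLogs/<account-id>/CloudTrail/<region>/<yyyy>/<mm>/<dd>/<file>.json.gz
CLOUDTRAIL_KEY = re.compile(
    r"^(?P<prefix>.*?AWSLogs/(?P<account>\d{12})/CloudTrail/)"
    r"(?P<region>[a-z0-9-]+)/(?P<year>\d{4})/(?P<month>\d{2})/(?P<day>\d{2})/"
)

def detect_cloudtrail_paths(keys):
    """Group object keys by base prefix, collecting the regions seen under each."""
    found = {}
    for key in keys:
        m = CLOUDTRAIL_KEY.match(key)
        if m:
            found.setdefault(m.group("prefix"), set()).add(m.group("region"))
    return found

paths = detect_cloudtrail_paths([
    "AWSLogs/123456789012/CloudTrail/us-east-1/2023/04/15/123456789012_CloudTrail_us-east-1_20230415T0000Z_abc.json.gz",
    "AWSLogs/123456789012/CloudTrail/us-west-2/2023/04/15/123456789012_CloudTrail_us-west-2_20230415T0005Z_def.json.gz",
    "AWSLogs/123456789012/CloudTrail-Digest/us-east-1/2023/04/15/digest.json.gz",
])
```

Note that the pattern matches only the log path itself, so sibling prefixes such as `CloudTrail-Digest` are skipped.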

### Generate IAM Credential Report

To generate and retrieve an IAM credential report:

```bash
# Display credential report in terminal
scope aws credential-report

# Save credential report as CSV
scope aws credential-report --format csv --output-file credentials.csv

# Save credential report as JSON
scope aws credential-report --format json --output-file credentials.json
```

Available parameters:
- `--output-file`: Path to save the output
- `--format`: Output format (choices: json, csv, terminal)

The credential report includes details about IAM users such as:
- Password and access key usage
- MFA status
- Access key rotation dates
- Last activity timestamps
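
IAM delivers the credential report as CSV text, so post-processing it is straightforward. As an illustration of the kind of analysis it enables, here is a stdlib-only sketch that flags console users without MFA; the sample data and helper name are invented, but `user`, `password_enabled`, and `mfa_active` are standard report columns:

```python
import csv
import io

# Invented sample data; the column names ("user", "password_enabled",
# "mfa_active") are standard IAM credential report columns.
REPORT_CSV = """user,password_enabled,mfa_active
<root_account>,not_supported,true
alice,true,true
bob,true,false
ci-deploy,false,false
"""

def users_without_mfa(report_csv):
    """Return console users (password enabled) that lack MFA."""
    rows = csv.DictReader(io.StringIO(report_csv))
    return [r["user"] for r in rows
            if r["password_enabled"] == "true" and r["mfa_active"] == "false"]

print(users_without_mfa(REPORT_CSV))  # only "bob" has a password but no MFA
```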

### Collect Management Events

To collect CloudTrail management events:

```bash
scope aws management --days 7 --output-file timeline.csv --format csv
```

Available parameters:
- `--days`: Number of days to look back (default: 7)
- `--output-file`: Path to save the timeline (required)
- `--format`: Choose between 'csv' or 'json' (default: csv)
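
Each event returned by CloudTrail's `LookupEvents` API carries the full record as a JSON string in its `CloudTrailEvent` field, so building a timeline is essentially a matter of parsing and flattening those records. A stdlib-only sketch (the output column names are illustrative, not Scope's actual schema):

```python
import json

def normalize_event(event):
    """Flatten one LookupEvents result into a timeline row.

    `event` mirrors the shape boto3's lookup_events returns: the full
    CloudTrail record arrives as a JSON string under "CloudTrailEvent".
    """
    record = json.loads(event["CloudTrailEvent"])
    return {
        "timestamp": record.get("eventTime"),
        "event_name": record.get("eventName"),
        "event_source": record.get("eventSource"),
        "region": record.get("awsRegion"),
        "user": record.get("userIdentity", {}).get("arn"),
        "source_ip": record.get("sourceIPAddress"),
    }

row = normalize_event({
    "EventName": "ConsoleLogin",
    "CloudTrailEvent": json.dumps({
        "eventTime": "2023-04-15T12:00:00Z",
        "eventName": "ConsoleLogin",
        "eventSource": "signin.amazonaws.com",
        "awsRegion": "us-east-1",
        "userIdentity": {"arn": "arn:aws:iam::123456789012:user/alice"},
        "sourceIPAddress": "203.0.113.7",
    }),
})
```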

### Collect from S3

To collect CloudTrail logs stored in an S3 bucket:

```bash
scope aws s3 --bucket your-cloudtrail-bucket --output-file timeline.csv
```

The command will automatically:
1. Discover the CloudTrail log structure in your bucket
2. Identify all available regions
3. Collect logs from all regions for the specified time period

For more control, you can specify additional parameters:

```bash
scope aws s3 --bucket your-cloudtrail-bucket --prefix AWSLogs/123456789012/CloudTrail/ --regions us-east-1 us-west-2 --start-date 2023-04-15 --end-date 2023-04-22 --output-dir ./raw_logs --output-file timeline.csv --format json
```

Available parameters:
- `--bucket`: S3 bucket containing CloudTrail logs (required)
- `--prefix`: S3 prefix to filter logs (optional)
- `--regions`: Specific regions to collect from (space-separated list)
- `--start-date`: Start date in YYYY-MM-DD format (default: 7 days ago)
- `--end-date`: End date in YYYY-MM-DD format (default: today)
- `--output-dir`: Directory to save raw logs (optional)
- `--output-file`: Path to save the timeline (required)
- `--format`: Choose between 'csv' or 'json' (default: csv)
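
Collection from S3 boils down to enumerating CloudTrail's date-partitioned prefixes for each region and day in the window, then downloading and parsing the objects underneath. A sketch of the prefix enumeration (the helper name is illustrative, not Scope's actual code):

```python
from datetime import date, timedelta

def daily_prefixes(base_prefix, regions, start, end):
    """Enumerate the per-day key prefixes CloudTrail writes under a bucket."""
    day = start
    while day <= end:
        for region in regions:
            # CloudTrail partitions keys as <prefix><region>/<yyyy>/<mm>/<dd>/
            yield f"{base_prefix}{region}/{day:%Y/%m/%d}/"
        day += timedelta(days=1)

prefixes = list(daily_prefixes(
    "AWSLogs/123456789012/CloudTrail/",
    ["us-east-1", "us-west-2"],
    date(2023, 4, 15),
    date(2023, 4, 16),
))
```

Each prefix can then be passed to an S3 list-objects call to find the log files for that region and day.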

### Collect from Local Files

To process CloudTrail logs that have already been downloaded to your local machine:

```bash
scope aws local --directory /path/to/logs --output-file timeline.csv
```

For recursive processing of all subdirectories:

```bash
scope aws local --directory /path/to/logs --recursive --output-file timeline.csv --format json
```

> **Note for Windows users**: When specifying file paths, use one of these formats:
> - Forward slashes: `C:/Users/username/Desktop/CloudTrail`
> - Escaped backslashes: `C:\\Users\\username\\Desktop\\CloudTrail`
> - Quoted paths: `"C:\Users\username\Desktop\CloudTrail"`

Available parameters:
- `--directory`: Directory containing CloudTrail logs (required)
- `--recursive`: Process subdirectories recursively
- `--output-file`: Path to save the timeline (required)
- `--format`: Choose between 'csv' or 'json' (default: csv)

This command will:
1. Find all CloudTrail log files (`.json` or `.json.gz`) in the specified directory
2. Parse and normalize the events
3. Create a standardized timeline in the specified format
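
The first two steps can be sketched with the standard library alone (the helper name and details are illustrative, not Scope's actual implementation):

```python
import gzip
import json
import os

def iter_cloudtrail_records(directory, recursive=False):
    """Yield raw events from CloudTrail log files (.json or .json.gz)."""
    for root, _dirs, files in os.walk(directory):
        for name in files:
            path = os.path.join(root, name)
            if name.endswith(".json.gz"):
                with gzip.open(path, "rt", encoding="utf-8") as f:
                    data = json.load(f)
            elif name.endswith(".json"):
                with open(path, encoding="utf-8") as f:
                    data = json.load(f)
            else:
                continue  # skip non-log files
            # CloudTrail log files wrap events in a top-level "Records" array
            yield from data.get("Records", [])
        if not recursive:
            break  # without --recursive, stay in the top-level directory
```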

### Exporting Timelines

Scope exports timelines to the output file you specify with `--output-file`. You can choose between CSV and JSON formats via `--format`.
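
A timeline export of this kind can be sketched in a few lines of stdlib Python (the helper is illustrative, not Scope's actual writer):

```python
import csv
import json

def export_timeline(rows, path, fmt="csv"):
    """Write timeline rows (a list of dicts) to `path` as CSV or JSON."""
    if fmt == "json":
        with open(path, "w", encoding="utf-8") as f:
            json.dump(rows, f, indent=2, default=str)
    elif fmt == "csv":
        with open(path, "w", newline="", encoding="utf-8") as f:
            writer = csv.DictWriter(f, fieldnames=list(rows[0]))
            writer.writeheader()
            writer.writerows(rows)
    else:
        raise ValueError(f"unsupported format: {fmt}")
```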
