Metadata-Version: 2.4
Name: java-hotspots-cli
Version: 0.1.0
Summary: Internationalized CLI tool for Java hotspot and risk analysis
Author: OpenAI
License: MIT
Project-URL: Homepage, https://example.com/java-hotspots-cli
Requires-Python: >=3.10
Description-Content-Type: text/markdown
Requires-Dist: requests>=2.31.0
Requires-Dist: lizard>=1.17.10
Requires-Dist: PyYAML>=6.0.1

# java-hotspots-cli

A pip-installable CLI tool for analyzing Java hotspots using Git churn and cyclomatic complexity.  

Author: Orhan Cavus  
GitHub: [https://github.com/orhancavus](https://github.com/orhancavus)

## Install

From the project directory:

```bash
pip install .
```

Editable install for development:

```bash
pip install -e .
```

Build a wheel (requires the `build` package, installable with `pip install build`):

```bash
python -m build
```

## CLI usage

```bash
java-hotspots --repos /path/to/repo1 /path/to/repo2
```

Turkish output:

```bash
java-hotspots --repos /path/to/repo1 --lang tr
```

Custom output base:

```bash
java-hotspots --repos /path/to/repo1 --out output/my_hotspots
```

This generates:

- `output/my_hotspots.md`
- `output/my_hotspots.csv`
- `output/my_hotspots.json`
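For programmatic post-processing, the JSON variant is the easiest to consume. A minimal sketch of loading it (the `load_report` helper is illustrative, not part of the package, and the report's internal schema is whatever the tool emits):

```python
import json
from pathlib import Path

def load_report(base: str) -> object:
    """Load the JSON report written for an --out base like 'output/my_hotspots'."""
    return json.loads(Path(f"{base}.json").read_text(encoding="utf-8"))
```

From there you can filter or sort entries with plain Python before feeding them into dashboards or CI checks.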

## Optional LLM integration

Pass `--ollama` to enable LLM integration, and `--ollama-model` to choose the model:

```bash
java-hotspots --repos /path/to/repo1 --ollama --ollama-model Qwen3-Coder-30B-A3B-Instruct-lao01vllm
```

### Environment variables

Configure LLM integration with these environment variables:

```bash
export LITELLM_COMPLETION_URL=https://your-llm-service.com
export LITELLM_API_KEY=your-api-key
export SSL_CERT_FILE=/path/to/cert.pem  # optional
```

Or set them inline:

```bash
LITELLM_COMPLETION_URL=https://your-llm-service.com LITELLM_API_KEY=your-api-key java-hotspots --repos /path/to/repo1 --ollama
```
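Inside the tool, these settings would typically be read with `os.environ`. A minimal sketch of that lookup (the `llm_config` helper name is ours, not the package's actual API):

```python
import os

def llm_config() -> dict:
    """Collect the LLM-related settings from the environment variables above."""
    return {
        "completion_url": os.environ.get("LITELLM_COMPLETION_URL"),
        "api_key": os.environ.get("LITELLM_API_KEY"),
        "ssl_cert_file": os.environ.get("SSL_CERT_FILE"),  # optional
    }
```

Unset variables come back as `None`, so callers can decide whether to fall back to a default endpoint or abort with a clear error.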

## Development layout

```text
java_hotspots_cli/
├── pyproject.toml
├── README.md
└── src/
    └── java_hotspots_cli/
        ├── __init__.py
        ├── __main__.py
        ├── cli.py
        ├── i18n.py
        └── translations/
            ├── en.json
            ├── tr.json
            ├── en.yaml
            └── tr.yaml
```
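The `translations/` catalogs suggest a simple key-to-string lookup. A hypothetical sketch of the kind of loader `i18n.py` could implement (the function name and the flat-JSON catalog shape are assumptions, not the package's actual API):

```python
import json
from pathlib import Path

def load_messages(translations_dir: str, lang: str = "en") -> dict:
    """Load a flat key -> translated-string catalog such as translations/en.json."""
    path = Path(translations_dir) / f"{lang}.json"
    return json.loads(path.read_text(encoding="utf-8"))
```

Keeping one catalog file per language makes adding a new locale a matter of dropping in another JSON (or YAML) file, with no code changes.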
