Metadata-Version: 2.4
Name: lazyline
Version: 0.3.0
Summary: Zero-config line-level Python profiler. Profile any package or script with one command — no @profile decorators, no code changes. Automatic subprocess and multiprocessing support.
Keywords: profiler,profiling,line-profiler,line-profiling,python-profiler,performance,benchmark,optimization,bottleneck,zero-config,timing,tracing,subprocess-profiling,multiprocessing,cli,developer-tools,debugging,code-optimization
Author: Tomáš Venkrbec
Author-email: Tomáš Venkrbec <venkrbec.tomas@gmail.com>
License-Expression: MIT
Classifier: Development Status :: 4 - Beta
Classifier: Environment :: Console
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Programming Language :: Python :: 3.14
Classifier: Topic :: Software Development :: Debuggers
Classifier: Topic :: Software Development :: Quality Assurance
Classifier: Topic :: Software Development :: Testing
Classifier: Topic :: System :: Benchmark
Classifier: Topic :: Utilities
Classifier: Typing :: Typed
Requires-Dist: typer>=0.15
Requires-Dist: click>=8.0
Requires-Dist: line-profiler>=4.1.0
Requires-Dist: pygments>=2.14.0
Requires-Python: >=3.10
Project-URL: Homepage, https://github.com/TomasVenkrbec/lazyline
Project-URL: Repository, https://github.com/TomasVenkrbec/lazyline
Project-URL: Documentation, https://github.com/TomasVenkrbec/lazyline/blob/main/docs/usage.md
Project-URL: Changelog, https://github.com/TomasVenkrbec/lazyline/blob/main/CHANGELOG.md
Project-URL: Issues, https://github.com/TomasVenkrbec/lazyline/issues
Description-Content-Type: text/markdown

# Lazyline: Zero-Config Line-Level Python Profiler

[![PyPI version](https://img.shields.io/pypi/v/lazyline)](https://pypi.org/project/lazyline/)
[![Downloads](https://img.shields.io/pypi/dm/lazyline)](https://pypi.org/project/lazyline/)
[![Python versions](https://img.shields.io/pypi/pyversions/lazyline)](https://pypi.org/project/lazyline/)
[![Tests](https://github.com/TomasVenkrbec/lazyline/actions/workflows/ci.yml/badge.svg)](https://github.com/TomasVenkrbec/lazyline/actions/workflows/ci.yml)
[![codecov](https://codecov.io/gh/TomasVenkrbec/lazyline/graph/badge.svg)](https://codecov.io/gh/TomasVenkrbec/lazyline)
[![License: MIT](https://img.shields.io/badge/License-MIT-blue.svg)](https://github.com/TomasVenkrbec/lazyline/blob/main/LICENSE)

**Zero-config, deterministic, line-level Python profiler.**
Find slow Python code and profile it line by line — no `@profile`
decorators, no code changes. Point it at a package or script and get
exact hit counts and timing for every line. Subprocesses and
multiprocessing pools profiled automatically. Find the lazy lines.

## Quick Start

```bash
pip install lazyline
# or: uvx lazyline

# Profile a package while running its tests:
lazyline run my_package -- pytest tests/

# Profile a script:
lazyline run script.py -- python script.py
```

![Lazyline demo](assets/demo.gif)

Line 14 burned 99.8% of `deduplicate`'s runtime checking membership in
a list on every iteration — that's your lazy line. Change `seen` to a
`set` and the function drops from O(n²) to O(n).
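
The demo's full source isn't shown here, so the functions below are a hypothetical reconstruction of that pattern — a list-backed `seen` check next to its set-backed fix:

```python
def deduplicate_slow(items):
    seen = []
    out = []
    for item in items:
        if item not in seen:  # O(n) list scan on every iteration — the lazy line
            seen.append(item)
            out.append(item)
    return out


def deduplicate_fast(items):
    seen = set()
    out = []
    for item in items:
        if item not in seen:  # O(1) average-case set lookup
            seen.add(item)
            out.append(item)
    return out
```

Both preserve input order and return the same result; only the membership data structure changes.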

> [!TIP]
> Using an AI coding assistant? Install the
> [lazyline plugin](#claude-code-plugin) for Claude Code, or copy
> [`skills/lazyline/SKILL.md`](skills/lazyline/SKILL.md) into any
> assistant that supports markdown skill files.

## Key Features

Need to find performance bottlenecks in your Python code without
modifying a single file? Lazyline wraps
[line_profiler](https://github.com/pyutils/line_profiler) and adds
everything needed to go from "I want to profile this package" to
"here are the bottlenecks" in a single command:

- **Zero configuration** — point at a package name, directory, or
  `.py` file. Every function is discovered and instrumented
  automatically. No `@profile` decorators, no code changes — be lazy,
  let the tool do the work.

- **Subprocess and multiprocessing** — `ProcessPoolExecutor`,
  `multiprocessing.Pool`, and child Python processes (Celery workers,
  Airflow tasks) are profiled automatically. Results are merged
  into a single report.

- **Deterministic precision** — exact hit counts and timing for every
  line, not statistical estimates. "This line ran 47,382 times and
  took 3.2s." When you need to distinguish O(n) from O(n²), exact
  counts are the difference.

- **Focused scope, clean output** — you choose exactly which package
  to profile. Unlike tools that profile everything in your working
  directory, lazyline keeps output relevant and overhead contained.
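
The multiprocessing support covers ordinary worker-pool scripts. The sketch below is a hypothetical example (`pool_demo.py` and `crunch` are invented names); lazyline would merge the per-worker line timings into one report:

```python
# pool_demo.py — hypothetical script; profile with:
#   lazyline run pool_demo.py -- python pool_demo.py
from concurrent.futures import ProcessPoolExecutor


def crunch(n):
    # Per-task work executed in child processes; lazyline reports
    # hit counts and timing for these lines across all workers.
    total = 0
    for i in range(n):
        total += i * i
    return total


if __name__ == "__main__":
    with ProcessPoolExecutor(max_workers=2) as pool:
        print(sum(pool.map(crunch, [10_000, 20_000])))
```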

## When to Use What

No tool is best for everything. Pick the right one for the job:

| You need... | Use | Why |
| ------------- | ----- | ----- |
| Exact line-level timing across a package, no code changes | **[lazyline](https://github.com/TomasVenkrbec/lazyline)** | Deterministic tracing with auto-discovery and subprocess support |
| Low-overhead profiling with memory, GPU, and AI suggestions | **[Scalene](https://github.com/plasma-umass/scalene)** | Sampling (~10–20% overhead), broad feature set, web UI |
| Attach to a running process in production | **[py-spy](https://github.com/benfred/py-spy)** | Out-of-process sampling, near-zero overhead, no restart needed |
| "Which function is slow?" with beautiful call trees | **[Pyinstrument](https://github.com/joerick/pyinstrument)** | Statistical profiler, tree output, low overhead |

### Feature comparison

| Feature | lazyline | kernprof | Scalene | py-spy | Pyinstrument | cProfile |
| --------- | ---------- | ---------- | --------- | -------- | -------------- | ---------- |
| Granularity | Line | Line | Line | Line | Function | Function |
| Method | Deterministic | Deterministic | Sampling | Sampling | Sampling | Deterministic |
| Code changes needed | None | `@profile` | None | None | None | None |
| Exact hit counts | Yes | Yes | No | No | No | Yes (fn-level) |
| Subprocess profiling | Automatic | No | Partial | Yes | No | No |
| Multiprocessing pools | Automatic | No | Partial | Yes | No | No |
| Memory profiling | Opt-in | No | Built-in | No | No | No |
| GPU profiling | No | No | Yes | No | No | No |
| Overhead | 1.2–7x | 1.2–7x | ~10–20% | ~0% | Low | Moderate |

**Lazyline trades overhead for precision.** Deterministic tracing fires
a callback on every line execution. For functions with real work (>0.1ms
per call), overhead is negligible (~1.2x). For tight loops calling tiny
functions millions of times, it can reach ~7x. Relative rankings are
always reliable — use lazyline to find *which* code is lazy, not to
measure *how fast* it runs. See
[benchmarks](https://github.com/TomasVenkrbec/lazyline/blob/main/benchmarks/README.md)
for detailed measurements.
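
As an illustration of those two regimes (function names invented for this sketch):

```python
def chunky(data):
    # Substantial work on one traced line: the per-line trace
    # callback cost amortizes away (~1.2x overhead).
    return sorted(data)


def tiny(x):
    # Trivial body: the trace callback dominates when this is
    # called millions of times (up to ~7x overhead).
    return x + 1


def hot_loop(n):
    # Under deterministic tracing, every iteration fires the callback.
    total = 0
    for _ in range(n):
        total = tiny(total)
    return total
```

Either way, `hot_loop` still ranks above `chunky` in the report, which is what matters for finding the bottleneck.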

## Usage

```bash
# Profile a package during its test suite
lazyline run my_package -- pytest tests/

# Profile while running a script
lazyline run my_package -- python evaluate.py

# Profile a CLI tool (hyphenated console scripts work too)
lazyline run my_package -- my-tool run-all

# Export results, view later
lazyline run -o results.json my_package -- pytest tests/
lazyline show results.json --top 10

# Multiple scopes in one run
lazyline run utils.py my_package -- python script.py
```

Requires Python 3.10+. The target package must be importable in the
same environment.

See the
[full usage guide](https://github.com/TomasVenkrbec/lazyline/blob/main/docs/usage.md)
for all CLI options, scope formats, command resolution, output details,
and more examples.

## Claude Code Plugin

Lazyline ships as a [Claude Code plugin](https://docs.anthropic.com/en/docs/claude-code/plugins).
Install it and Claude will know how to profile your code, interpret
results, and suggest optimizations:

```bash
/plugin marketplace add TomasVenkrbec/lazyline
/plugin install lazyline@lazyline
```

Then use `/lazyline my_package -- python main.py` or let Claude invoke
it automatically when you ask about performance.

The skill also works with any AI coding assistant that supports
markdown skill files — copy
[`skills/lazyline/SKILL.md`](skills/lazyline/SKILL.md)
into your assistant's configuration.

## Documentation

- **[Usage Guide][docs-usage]** — CLI reference, scope formats,
  output details
- **[How It Works][docs-how]** — architecture, overhead, limitations
- **[Benchmarks][docs-bench]** — overhead measurements and methodology
- **[Contributing][docs-contrib]** — development setup, tests,
  code style
- **[Changelog][docs-changelog]**

[docs-usage]: https://github.com/TomasVenkrbec/lazyline/blob/main/docs/usage.md
[docs-how]: https://github.com/TomasVenkrbec/lazyline/blob/main/docs/how-it-works.md
[docs-bench]: https://github.com/TomasVenkrbec/lazyline/blob/main/benchmarks/README.md
[docs-contrib]: https://github.com/TomasVenkrbec/lazyline/blob/main/CONTRIBUTING.md
[docs-changelog]: https://github.com/TomasVenkrbec/lazyline/blob/main/CHANGELOG.md

---

If lazyline helped you find a bottleneck, consider giving it a
[star](https://github.com/TomasVenkrbec/lazyline) — it helps others
discover the project. Found a problem?
[Open an issue](https://github.com/TomasVenkrbec/lazyline/issues).

## License

MIT — see [LICENSE](https://github.com/TomasVenkrbec/lazyline/blob/main/LICENSE).
