Metadata-Version: 2.1
Name: vrfy
Version: 0.2.0
Summary: Verify with VRFY: Ensure the integrity of your file copies, hash by hash!
Home-page: https://github.com/BumblebeeMan/vrfy
Author: BumblebeeMan (Dennis Koerner)
Author-email: dennis@bumblebeeman.dev
Keywords: vrfy,verify,check,directory,hash
Classifier: Development Status :: 4 - Beta
Classifier: License :: OSI Approved :: GNU General Public License v3 (GPLv3)
Classifier: Operating System :: OS Independent
Classifier: Environment :: Console
Classifier: Topic :: System :: Filesystems
Classifier: Topic :: System :: Monitoring
Classifier: Topic :: System :: Archiving
Classifier: Topic :: System :: Archiving :: Backup
Classifier: Topic :: System :: Archiving :: Mirroring
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3 :: Only
Classifier: Programming Language :: Python :: 3.7
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Requires-Python: >=3.7
Description-Content-Type: text/markdown
License-File: LICENSE

# Verify with VRFY: Ensure the integrity of your file copies, hash by hash!

When beginning to explore cloud storage (or other methods of storing files remotely), one may become concerned about file integrity at some point. When dealing with a large number of individual files, how can one be certain that no file becomes corrupted during storage, and that downloaded files remain identical to their uploaded versions? This concern is particularly relevant after files have passed through multiple segmentation and encryption/decryption steps. Verifying all files manually is impractical, making file corruption a common worry, especially for those using smaller cloud providers.

These concerns can be addressed with a small console application called **vrfy**. **vrfy** handles the heavy lifting and verifies the integrity of your backups for you!
With a single, easy command, you can:

1. Verify that two copies are identical (i.e., all files are present in both locations and, by comparing SHA-256 checksums, that not even a single bit has changed).
2. Create and store a file with SHA-256 checksums alongside your data. This file can be backed up with your data and used for verification later.
3. Verify that your data is unchanged using the stored checksum file.
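The core idea of comparing two directories hash by hash can be sketched in plain Python. The helper names below are illustrative only, not vrfy's actual API:

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path, chunk_size: int = 65536) -> str:
    """Compute the SHA-256 hex digest of a file, reading it in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def dirs_match(master: Path, clone: Path) -> bool:
    """True if both directories contain the same file names with identical hashes."""
    master_files = {p.name: sha256_of(p) for p in master.iterdir() if p.is_file()}
    clone_files = {p.name: sha256_of(p) for p in clone.iterdir() if p.is_file()}
    return master_files == clone_files
```

Because the comparison covers both the set of file names and every digest, a missing file and a single flipped bit are detected alike.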

## Installation
### PIP: 
Install **vrfy** using pip:
```bash
pip install vrfy --user
```
 
## Usage
### 1. Verifying that two directories are identical
Verify that the contents of '/path/of/clone' are identical to those of '/path/of/master'. For example, '/path/of/master' might be a local backup, whereas '/path/of/clone' might have been downloaded from cloud storage.
```bash
vrfy /path/of/master /path/of/clone
```
### 2. Storing checksums for future verification
Creating a file that lists checksums for all files within a directory:
```bash
vrfy -c /path/of/data
```
With the **-r** (recursive) option, checksum files are created in all sub-directories as well:
```bash
vrfy -r -c /path/of/data
```
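Conceptually, this step records one digest per file next to the data. A minimal Python sketch of the idea follows; the `checksums.txt` name and the `digest  filename` line format are illustrative assumptions, not vrfy's actual on-disk format:

```python
import hashlib
from pathlib import Path

def write_checksums(directory: str) -> None:
    """Write one 'digest  filename' line per file into checksums.txt."""
    root = Path(directory)
    lines = []
    for p in sorted(root.iterdir()):
        if p.is_file():
            digest = hashlib.sha256(p.read_bytes()).hexdigest()
            lines.append(f"{digest}  {p.name}")
    (root / "checksums.txt").write_text("\n".join(lines) + "\n")
```

Storing the checksum file alongside the data means it travels with every backup copy and can be used for verification at any later point.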
### 3. Verifying files against stored checksums
Verifying that all files within a directory haven't been changed (i.e., their checksums still match):
```bash
vrfy -v /path/of/data
```
With the **-r** (recursive) option, all sub-directories are verified as well:
```bash
vrfy -r -v /path/of/data
```
When no parameters are provided, vrfy verifies the current working directory and all its sub-directories:
```bash
vrfy
```
Verifying a single file against an expected checksum:
```bash
vrfy -f /path/of/file/filename -cs expectedChecksum
```
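Checking a single file against an expected digest boils down to one hash comparison. A minimal Python equivalent (illustrative only, not vrfy's internals):

```python
import hashlib

def matches_expected(filename: str, expected_checksum: str) -> bool:
    """Return True if the file's SHA-256 digest equals the expected hex string."""
    h = hashlib.sha256()
    with open(filename, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest() == expected_checksum.lower()
```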
### 4. Other CLI options
Display the version of vrfy:
```bash
vrfy -version
```
Display the checksum of a given file:
```bash
vrfy -p /path/of/file/filename
```
