Metadata-Version: 2.1
Name: gradefast
Version: 0.1.3
Summary: A framework that takes care of boundary tasks to ease task evaluations and make them faster.
Home-page: https://github.com/eyantra-eysip/GradeFast-2019
Author: Omkar Manjrekar
Author-email: manjrekarom@gmail.com
License: MIT
Platform: UNKNOWN
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.6
Classifier: Programming Language :: Python :: Implementation :: CPython
Classifier: Programming Language :: Python :: Implementation :: PyPy
Requires-Python: >=3.5.0
Description-Content-Type: text/markdown
Requires-Dist: numpy
Requires-Dist: requests
Requires-Dist: beautifulsoup4
Requires-Dist: pandas
Requires-Dist: halo


# GradeFast-2019

![GradeFast Logo](assets/gradefast-logo.png)

GradeFast is a framework intended to make eYRC evaluations faster and to standardize task evaluation. A key benefit is that test scripts can be extended and reused across themes with similar tasks. The primary idea behind GradeFast is to take care of the boundary tasks that occur in theme evaluation, and to provide utility scripts for other common chores so that a lot of time is saved. These tasks include downloading new submissions, iterating over each submission folder, easily locating the files to test, adding comments based on criteria, uploading marks, and plagiarism checking. The framework is also easy to use, placing little to no cognitive load on the theme developer.

## Installing gradefast

### Installing from pip
`$ pip install gradefast`

**OR** 

### Installing from sources
1. `$ git clone https://github.com/eyantra-eysip/GradeFast-2019`
2. `$ cd GradeFast-2019`
3. `$ git checkout develop`
4. `$ pip install .`

## Features

1. Submission downloading
2. Plagiarism checking
3. Primarily built for Python, but can support evaluating tasks in other languages
4. Aggregate results
5. Upload marks
6. Add comments

## Also includes

1. Detailed error logging
2. Timing and static code analysis of scripts
3. Resume tasks from where they are stopped
4. Checking test conventions
5. Boilerplate code generation
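As a sketch of what the timing analysis above might measure (plain Python, not the GradeFast API; `timed_run` is a hypothetical helper for illustration):

```python
import time

def timed_run(func, *args, **kwargs):
    """Run `func` with the given arguments and return (result, elapsed seconds)."""
    start = time.perf_counter()
    result = func(*args, **kwargs)
    elapsed = time.perf_counter() - start
    return result, elapsed
```

A framework can record such timings per submission to flag scripts that are unusually slow.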

## Future work

- [ ] Statistics and analysis of results
- [ ] Sandboxed environments
- [ ] Multi-programming

## Running tests

1. For unit tests: ``python -m unittest discover -s tests/unit/``
2. For integration tests: ``python -m unittest discover -s tests/integration/``

## Checking coverage

1. Install coverage.py: `pip install coverage`
2. Run `coverage run --source=gradefast/ -m unittest discover -s tests/unit/` or
   `coverage run --source=gradefast/ -m unittest discover -s tests/integration/`
3. Generate a static HTML report: `coverage html`
4. `cd htmlcov` and serve it with `live-server` or `python3 -m http.server`

## Building and running documentation

### Installing required packages

1. ``$ pip install sphinx_ustack_theme``
2. ``$ pip install --upgrade sphinx``

### Building docs

1. ``$ cd docs``
2. ``$ make html``

### Running web server to host documentation

1. ``$ cd docs/_build/html``
2. Start a web server in this directory. You can use either the python 3
   HTTP server or [live-server](https://www.npmjs.com/package/live-server).

   ``$ python3 -m http.server``


