Metadata-Version: 2.1
Name: agentmake
Version: 0.0.39
Summary: ToolMate-SDK: a software development kit for developing agentic AI applications that support 14 AI backends and work with 7 agentic components, such as tools and agents. (Developer: Eliran Wong)
Home-page: https://github.com/eliranwong/agentmake
Author: Eliran Wong
Author-email: support@toolmate.ai
License: GNU General Public License (GPL)
Project-URL: Source, https://github.com/eliranwong/agentmake
Project-URL: Tracker, https://github.com/eliranwong/agentmake/issues
Project-URL: Documentation, https://github.com/eliranwong/agentmake/wiki
Project-URL: Funding, https://www.paypal.me/toolmate
Keywords: toolmate ai sdk anthropic azure chatgpt cohere deepseek genai github googleai groq llamacpp mistral ollama openai vertexai xai
Platform: UNKNOWN
Classifier: Development Status :: 5 - Production/Stable
Classifier: Intended Audience :: End Users/Desktop
Classifier: Topic :: Utilities
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Classifier: Topic :: Software Development :: Build Tools
Classifier: License :: OSI Approved :: GNU General Public License v3 or later (GPLv3+)
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Requires-Python: >=3.8, <3.13
License-File: LICENSE
Requires-Dist: anthropic>=0.45.2
Requires-Dist: beautifulsoup4
Requires-Dist: cohere>=5.13.11
Requires-Dist: groq>=0.17.0
Requires-Dist: importlib-metadata
Requires-Dist: markitdown
Requires-Dist: mistralai>=1.5.0
Requires-Dist: ollama>=0.4.7
Requires-Dist: openai>=1.61.0
Requires-Dist: packaging
Requires-Dist: prompt-toolkit
Requires-Dist: pygments
Requires-Dist: pyperclip
Requires-Dist: python-dotenv
Requires-Dist: tiktoken>=0.8.0
Provides-Extra: genai
Requires-Dist: google-genai>=0.8.0; extra == "genai"

# AgentMake AI

AgentMake AI: a software development kit for developing agentic AI applications that support 14 AI backends and work with 7 agentic components, such as tools and agents. (Developer: Eliran Wong)

Supported backends: anthropic, azure, cohere, custom, deepseek, genai, github, googleai, groq, llamacpp, mistral, ollama, openai, vertexai, xai

# Video Introduction

[![Watch the video](https://img.youtube.com/vi/JyJxrvrJyqM/maxresdefault.jpg)](https://youtu.be/JyJxrvrJyqM)

[9-min introduction](https://youtu.be/JyJxrvrJyqM) | [24-min introduction](https://youtu.be/NMmuuWm2ixY)

# Sibling Projects

This SDK incorporates the best aspects of our favorite projects, [LetMeDoIt AI](https://github.com/eliranwong/letmedoit), [Toolmate AI](https://github.com/eliranwong/toolmate) and [TeamGen AI](https://github.com/eliranwong/teamgenai), to create a library aimed at further advancing the development of agentic AI applications.

# Supported backends

`anthropic` - [Anthropic API](https://console.anthropic.com/)

`azure` - [Azure OpenAI API](https://learn.microsoft.com/en-us/azure/ai-services/openai/reference)

`cohere` - [Cohere API](https://docs.cohere.com/docs/the-cohere-platform)

`custom` - any OpenAI-compatible backend that supports function calling

`deepseek` - [DeepSeek API](https://platform.deepseek.com/)

`genai` - [Vertex AI](https://cloud.google.com/vertex-ai) or [Google AI](https://ai.google.dev/)

`github` - [GitHub API](https://docs.github.com/en/github-models/prototyping-with-ai-models#experimenting-with-ai-models-using-the-api)

`googleai` - [Google AI](https://ai.google.dev/)

`groq` - [Groq Cloud API](https://console.groq.com)

`llamacpp` - [Llama.cpp Server](https://github.com/ggerganov/llama.cpp) - [local setup](https://github.com/ggerganov/llama.cpp/blob/master/docs/build.md) required

`mistral` - [Mistral API](https://console.mistral.ai/api-keys/)

`ollama` - [Ollama](https://ollama.com/) - [local setup](https://ollama.com/download) required

`openai` - [OpenAI API](https://platform.openai.com/)

`vertexai` - [Vertex AI](https://cloud.google.com/vertex-ai)

`xai` - [XAI API](https://x.ai/api)

For simplicity, `agentmake` uses `ollama` as the default backend if the `backend` parameter is not specified. Ollama models are downloaded automatically if they are not already present. Users can change the default backend by setting the environment variable `DEFAULT_AI_BACKEND`.
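
For example, the default backend can be switched by exporting the variable in your shell (a minimal sketch; `openai` is just an illustrative value, any supported backend name works):

```shell
# Change the default backend used when the `backend` parameter is not given.
# Add this line to your shell profile (e.g. ~/.bashrc) to make it persistent.
export DEFAULT_AI_BACKEND=openai
```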

# Introducing Agentic Components

`agentmake` is designed to work with seven kinds of components for building agentic applications:

1. `system` - System messages are crucial for guiding how AI agents interact with users.

2. `context` - Predefined instructions that are added to users' prompts as prefixes, before they are passed to the AI models.

3. `input_content_plugin` - Input content plugins process or transform user inputs before they are passed to the AI models.

4. `output_content_plugin` - Output content plugins process or transform assistant responses after they are generated by AI models.

5. `tool` - Tools take simple structured actions in response to users' requests, with the use of `schema` and `function calling`.

6. `agent` - Agents are agentic applications that automate multi-step actions or decisions to fulfill complicated requests. They can be executed on their own or integrated into an agentic workflow, supported by `agentmake`, to work collaboratively with other agents or components.

7. `follow_up_prompt` - Predefined prompts that are helpful for automating a series of follow-up responses after the first assistant response is generated.
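
As a rough illustration of the `schema` plus `function calling` pattern that a tool relies on, consider the hypothetical sketch below. This is NOT agentmake's actual tool file format (which is not documented in this README); the names and structure follow the generic OpenAI-style function-calling convention:

```python
# Hypothetical sketch of the schema + function-calling pattern behind a tool.
# NOT agentmake's actual tool format; names here are illustrative only.
TOOL_SCHEMA = {
    "name": "search_google",
    "description": "Search Google and return the results for a query.",
    "parameters": {
        "type": "object",
        "properties": {
            "query": {"type": "string", "description": "The search query."},
        },
        "required": ["query"],
    },
}

def search_google(query: str) -> str:
    """Stub: a real tool would perform the search here."""
    return f"results for: {query}"

# The AI model receives TOOL_SCHEMA, returns structured arguments,
# and the SDK dispatches them to the matching function:
args = {"query": "What is ToolMate AI?"}
print(search_google(**args))
```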

# Built-in and Custom Agentic Components

`agentmake` supports both built-in agentic components, created by our developers or contributors, and custom agentic components, created by users to meet their own needs.

## Built-in Agentic Components

Built-in agentic components are placed in the following six folders inside the `agentmake` package folder:

`agents`, `contexts`, `plugins`, `prompts`, `systems`, `tools`

To use the built-in components, you only need to specify the component filenames, without parent paths or file extensions, when you run the `agentmake` signature function or CLI options.

## Custom Agentic Components

`agentmake` offers two options for users to use their custom components.

Option 1: Specify the full file paths of individual components

Since each component can be organised as a single file, users only need to specify the file paths of the custom components they want to use when they run the `agentmake` signature function or CLI options.

Option 2: Place custom components into `agentmake` user directory

The default `agentmake` user directory is `~/agentmake`, i.e. a folder named `agentmake` created under the user's home directory. Users may define their own path by setting the environment variable `AGENTMAKE_USER_DIR`.

After creating a folder named `agentmake` under your home directory, create six sub-folders in it, using the same six names listed above (`agents`, `contexts`, `plugins`, `prompts`, `systems`, `tools`), and place your custom components in the relevant folders, as we do with our built-in components.

If you organize the custom agentic components in this way, you only need to specify the component filenames, without parent paths or file extensions, when you run the `agentmake` signature function or CLI options.
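
Assuming the default user directory, the layout described above can be created in one step (a sketch using the six folder names listed earlier):

```shell
# Create the agentmake user directory with its six component sub-folders.
# AGENTMAKE_USER_DIR falls back to ~/agentmake when not set, as described above.
AGENTMAKE_USER_DIR="${AGENTMAKE_USER_DIR:-$HOME/agentmake}"
for d in agents contexts plugins prompts systems tools; do
    mkdir -p "$AGENTMAKE_USER_DIR/$d"
done
```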

# Installation

Basic:

> pip install --upgrade agentmake

Basic installation supports all AI backends mentioned above, except for `vertexai`.

Extras:

We support Vertex AI via the [Google GenAI SDK](https://pypi.org/project/google-genai/). As this package supports most platforms, except Android Termux, we separate the `google-genai` package as an extra. To support Vertex AI with `agentmake`, install by running:

> pip install --upgrade agentmake[genai]

# Usage

This SDK is designed to offer a single signature function `agentmake` for interacting with all AI backends, delivering a unified experience for generating AI responses. The main API is provided by the function `agentmake`, located in this [file](https://github.com/eliranwong/agentmake/blob/main/agentmake/__init__.py#L54).

Find documentation at https://github.com/eliranwong/agentmake/blob/main/docs/README.md

# Examples

The following examples assume [Ollama](https://ollama.com/) is [installed](https://ollama.com/download) as the default backend.

To import:

> from agentmake import agentmake

To run, e.g.:

> agentmake("What is AI?")

To work with parameter `tool`, e.g.:

> agentmake("What is ToolMate AI?", tool="search_google")

> agentmake("How many 'r's are there in the word 'strawberry'?", tool="magic")

> agentmake("What time is it right now?", tool="magic")

> agentmake("Open github.com in a web browser.", tool="magic")

> agentmake("Convert file 'music.wav' into mp3 format.", tool="magic")

> agentmake("Send an email to Eliran Wong at eliran.wong@domain.com to express my gratitude for his work.", tool="send_gmail")

To work with parameters `input_content_plugin` and `output_content_plugin`, e.g.:

> agentmake("what AI model best", input_content_plugin="improve_writing", output_content_plugin="translate_into_chinese", stream=True)

To work with parameters `system`, `context` and `follow_up_prompt`, e.g.:

> agentmake("Is it better to drink wine in the morning, afternoon, or evening?", context="reflect", stream=True)

> agentmake("Is it better to drink wine in the morning, afternoon, or evening?", context="think", follow_up_prompt=["review", "refine"], stream=True)

> agentmake("Provide a detailed introduction to generative AI.", system=["create_agents", "assign_agents"], follow_up_prompt="Who is the best agent to contribute next?", stream=True, model="llama3.3:70b")

To work with parameter `agent`, e.g.:

> agentmake("Write detailed comments about the works of William Shakespeare, focusing on his literary contributions, dramatic techniques, and the profound impact he has had on the world of literature and theatre.", agent="teamgenai", stream=True, model="llama3.3:70b")

To work collaboratively with different backends, e.g.:

> messages = agentmake("What is the most effective method for training AI models?", backend="openai")

> messages = agentmake(messages, backend="googleai", follow_up_prompt="Can you give me some different options?")

> messages = agentmake(messages, backend="xai", follow_up_prompt="What are the limitations or potential biases in this information?")

> agentmake(messages, backend="mistral", follow_up_prompt="Please provide a summary of the discussion so far.")


As you can see, the `agentmake` function returns the `messages` list, which is passed to the next `agentmake` function in turn.

Therefore, it is very simple to create a chatbot application; you can do it in five lines or fewer, e.g.:


> messages = [{"role": "system", "content": "You are an AI assistant."}]

> user_input = "Hello!"

> while user_input:

>     messages = agentmake(messages, follow_up_prompt=user_input, stream=True)

>     user_input = input("Enter your query:\n(enter a blank entry to exit)\n>>> ")

These are just a few simple and straightforward examples.  You may find more examples at:

https://github.com/eliranwong/agentmake/tree/main/agentmake/examples

# CLI Options

The CLI commands are designed for quick runs of AI features.

To check the available CLI options, run:

> agentmake -h

Two shortcut commands:

`ai` == `agentmake`

`aic` == `agentmake -c` with chat features enabled

The available CLI options use the same parameter names as the `agentmake` function for AI backend configurations, to offer users a unified experience. Below are some CLI examples that are equivalent to some of the examples mentioned above:

> ai What is AI?

> ai What is ToolMate AI --tool search_google

> ai Convert file music.wav into mp3 format. --tool task

> ai Send an email to Eliran Wong at eliran.wong@domain.com to express my gratitude for his work --tool send_gmail

> ai what AI model best --input_content_plugin improve_writing --output_content_plugin translate_into_chinese

> ai Is it better to drink wine in the morning, afternoon, or evening? --context think --follow_up_prompt review --follow_up_prompt refine

> ai Write detailed comments about the works of William Shakespeare, focusing on his literary contributions, dramatic techniques, and the profound impact he has had on the world of literature and theatre --agent teamgenai --model "llama3.3:70b"

# AI Backends Configurations

To use `ollama` as the default backend, you need to [download and install](https://ollama.com/download) Ollama. To use backends other than Ollama, you need your own API keys. There are a few ways to configure the AI backends to work with `agentmake`.

## Option 1 - Use the `agentmake` function

Specify AI backend configurations as [parameters](https://github.com/eliranwong/agentmake/tree/main/docs#usage) when you run the signature function `agentmake`.

Configurations set via option 1 override the defaults set by options 2 and 3, but only for that particular function call. The defaults described in options 2 and 3 still apply the next time you run the `agentmake` function without specifying the AI backend parameters. This gives you the flexibility to specify settings that differ from the defaults.
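
This precedence can be pictured with a small sketch (illustrative only; `resolve_backend` is a hypothetical helper, not part of the agentmake API):

```python
import os

# Options 2/3 set a persistent default via the environment.
os.environ["DEFAULT_AI_BACKEND"] = "ollama"

def resolve_backend(backend=None):
    """Hypothetical helper: a per-call parameter (option 1) wins over the
    environment default (options 2/3), but only for that call."""
    return backend or os.environ.get("DEFAULT_AI_BACKEND", "ollama")

print(resolve_backend())          # the environment default applies
print(resolve_backend("openai"))  # a one-off, per-call override
```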

## Option 2 - Export individual environment variables

You may manually export individual environment variables listed in https://github.com/eliranwong/agentmake/blob/main/agentmake.env

## Option 3 - Export default environment variables once for all

You may edit a copy of `agentmake.env`, e.g.

```
cd agentmake
cp agentmake.env .env
etextedit .env
```

The changes apply the next time you run the `agentmake` function or CLI.

Alternatively, use the built-in `agentmake` CLI option to edit the variables:

> agentmake -ec

This command automatically makes a copy of `agentmake.env` and saves it as `.env` if one does not exist. Remember to save your changes before exiting the text editor for them to take effect.

Remarks:

1. Please do not edit the file `agentmake.env` directly, as it is restored to its default values upon each upgrade.  It is recommended to make a copy of it and edit the copied file.
2. Multiple API keys are supported for running the backends `cohere`, `github`, `groq` and `mistral`. You may configure API keys for these backends in the `.env` file by using commas `,` as separators, e.g. `COHERE_API_KEY=cohere_api_key_1,cohere_api_key_2,cohere_api_key_3`
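
The comma-separated format is straightforward to split; the sketch below illustrates how multiple keys might be rotated (the round-robin rotation is an assumption for illustration, not agentmake's documented behavior):

```python
import itertools
import os

# Example of the comma-separated multi-key format described above.
os.environ["COHERE_API_KEY"] = "cohere_api_key_1,cohere_api_key_2,cohere_api_key_3"

# Split the variable into individual keys.
keys = os.environ["COHERE_API_KEY"].split(",")

# Hypothetical round-robin rotation over the configured keys.
key_cycle = itertools.cycle(keys)
print(next(key_cycle))
print(next(key_cycle))
```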

# TODO

* add documentation about tool creation
* add examples
* convert available ToolMate AI tools into tools that are runnable with this SDK
* add built-in system messages
* add built-in predefined contexts
* add built-in prompts


