Metadata-Version: 2.4
Name: stackone-ai
Version: 2.4.0
Summary: agents performing actions on your SaaS
Author-email: StackOne <support@stackone.com>
License-File: LICENSE
Classifier: Development Status :: 4 - Beta
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Requires-Python: >=3.10
Requires-Dist: bm25s>=0.2.2
Requires-Dist: httpx>=0.28.0
Requires-Dist: langchain-core>=0.1.0
Requires-Dist: numpy>=1.24.0
Requires-Dist: pydantic>=2.10.6
Requires-Dist: typing-extensions>=4.0.0
Provides-Extra: examples
Requires-Dist: crewai>=0.102.0; extra == 'examples'
Requires-Dist: langchain-openai>=0.3.6; extra == 'examples'
Requires-Dist: langgraph>=0.2.0; extra == 'examples'
Requires-Dist: openai>=1.63.2; extra == 'examples'
Requires-Dist: python-dotenv>=1.0.1; extra == 'examples'
Provides-Extra: mcp
Requires-Dist: mcp>=1.3.0; extra == 'mcp'
Description-Content-Type: text/markdown

# StackOne AI SDK

[![PyPI version](https://badge.fury.io/py/stackone-ai.svg)](https://badge.fury.io/py/stackone-ai)
[![GitHub release (latest by date)](https://img.shields.io/github/v/release/StackOneHQ/stackone-ai-python)](https://github.com/StackOneHQ/stackone-ai-python/releases)
[![Coverage](https://stackonehq.github.io/stackone-ai-python/badges.svg)](https://stackonehq.github.io/stackone-ai-python/html/)
[![DeepWiki](https://img.shields.io/badge/DeepWiki-StackOneHQ%2Fstackone--ai--python-blue.svg?logo=data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAACwAAAAyCAYAAAAnWDnqAAAAAXNSR0IArs4c6QAAA05JREFUaEPtmUtyEzEQhtWTQyQLHNak2AB7ZnyXZMEjXMGeK/AIi+QuHrMnbChYY7MIh8g01fJoopFb0uhhEqqcbWTp06/uv1saEDv4O3n3dV60RfP947Mm9/SQc0ICFQgzfc4CYZoTPAswgSJCCUJUnAAoRHOAUOcATwbmVLWdGoH//PB8mnKqScAhsD0kYP3j/Yt5LPQe2KvcXmGvRHcDnpxfL2zOYJ1mFwrryWTz0advv1Ut4CJgf5uhDuDj5eUcAUoahrdY/56ebRWeraTjMt/00Sh3UDtjgHtQNHwcRGOC98BJEAEymycmYcWwOprTgcB6VZ5JK5TAJ+fXGLBm3FDAmn6oPPjR4rKCAoJCal2eAiQp2x0vxTPB3ALO2CRkwmDy5WohzBDwSEFKRwPbknEggCPB/imwrycgxX2NzoMCHhPkDwqYMr9tRcP5qNrMZHkVnOjRMWwLCcr8ohBVb1OMjxLwGCvjTikrsBOiA6fNyCrm8V1rP93iVPpwaE+gO0SsWmPiXB+jikdf6SizrT5qKasx5j8ABbHpFTx+vFXp9EnYQmLx02h1QTTrl6eDqxLnGjporxl3NL3agEvXdT0WmEost648sQOYAeJS9Q7bfUVoMGnjo4AZdUMQku50McDcMWcBPvr0SzbTAFDfvJqwLzgxwATnCgnp4wDl6Aa+Ax283gghmj+vj7feE2KBBRMW3FzOpLOADl0Isb5587h/U4gGvkt5v60Z1VLG8BhYjbzRwyQZemwAd6cCR5/XFWLYZRIMpX39AR0tjaGGiGzLVyhse5C9RKC6ai42ppWPKiBagOvaYk8lO7DajerabOZP46Lby5wKjw1HCRx7p9sVMOWGzb/vA1hwiWc6jm3MvQDTogQkiqIhJV0nBQBTU+3okKCFDy9WwferkHjtxib7t3xIUQtHxnIwtx4mpg26/HfwVNVDb4oI9RHmx5WGelRVlrtiw43zboCLaxv46AZeB3IlTkwouebTr1y2NjSpHz68WNFjHvupy3q8TFn3Hos2IAk4Ju5dCo8B3wP7VPr/FGaKiG+T+v+TQqIrOqMTL1VdWV1DdmcbO8KXBz6esmYWYKPwDL5b5FA1a0hwapHiom0r/cKaoqr+27/XcrS5UwSMbQAAAABJRU5ErkJggg==)](https://deepwiki.com/StackOneHQ/stackone-ai-python)

<!-- DeepWiki badge generated by https://deepwiki.ryoppippi.com/ -->

StackOne AI provides a unified interface for accessing various SaaS tools through AI-friendly APIs.

## Features

- Unified interface for multiple SaaS tools
- AI-friendly tool descriptions and parameters
- **Tool Calling**: Invoke tools directly with `tool.call()`
- **MCP-backed Dynamic Discovery**: Fetch tools at runtime via `fetch_tools()` with provider, action, and account filtering
- **Advanced Tool Filtering**:
  - Glob pattern filtering with patterns like `"salesforce_*"` and exclusions `"!*_delete_*"`
  - Provider and action filtering
  - Multi-account support
- **Semantic Search**: AI-powered tool discovery using natural language queries
- **Search Tool**: Callable tool discovery for agent loops via `get_search_tool()`
- Integration with popular AI frameworks:
  - OpenAI Functions
  - LangChain Tools
  - CrewAI Tools
  - LangGraph Tool Node

## Requirements

- Python 3.10+

## Installation

```bash
pip install 'stackone-ai[mcp]'

# Or with uv
uv add 'stackone-ai[mcp]'
```

### Optional Features

```bash
# Install with CrewAI examples
pip install 'stackone-ai[mcp,examples]'
# or
uv add 'stackone-ai[mcp,examples]'
```

## Quick Start

```python
from stackone_ai import StackOneToolSet

# Initialize with API key
toolset = StackOneToolSet()  # Uses STACKONE_API_KEY env var
# Or explicitly: toolset = StackOneToolSet(api_key="your-api-key")

# Get HRIS-related tools with glob patterns
tools = toolset.fetch_tools(actions=["bamboohr_*"], account_ids=["your-account-id"])

# Use a specific tool with the call method
employee_tool = tools.get_tool("bamboohr_get_employee")
# Call with keyword arguments
employee = employee_tool.call(id="employee-id")
# Or with traditional execute method
employee = employee_tool.execute({"id": "employee-id"})
```

## Tool Filtering

StackOne AI SDK provides powerful filtering capabilities to help you select the exact tools you need.

### Filtering with `fetch_tools()`

The `fetch_tools()` method provides filtering by providers, actions, and account IDs:

```python
from stackone_ai import StackOneToolSet

toolset = StackOneToolSet()

# Filter by account IDs
tools = toolset.fetch_tools(account_ids=["acc-123", "acc-456"])

# Filter by providers (case-insensitive)
tools = toolset.fetch_tools(providers=["hibob", "bamboohr"])

# Filter by action patterns with glob support
tools = toolset.fetch_tools(actions=["*_list_employees"])

# Combine multiple filters
tools = toolset.fetch_tools(
    account_ids=["acc-123"],
    providers=["hibob"],
    actions=["*_list_*"]
)

# Use set_accounts() for chaining
toolset.set_accounts(["acc-123", "acc-456"])
tools = toolset.fetch_tools(providers=["hibob"])
```

**Filtering Options:**

- **`account_ids`**: Filter tools by account IDs. Tools will be loaded for each specified account.
- **`providers`**: Filter by provider names (e.g., `["hibob", "bamboohr"]`). Case-insensitive matching.
- **`actions`**: Filter by action patterns with glob support:
  - Exact match: `["bamboohr_list_employees"]`
  - Glob pattern: `["*_list_employees"]` matches all tools ending with `_list_employees`
  - Provider prefix: `["bamboohr_*"]` matches all BambooHR tools
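
The Features section also mentions exclusion patterns like `"!*_delete_*"`. As a mental model, glob filtering with exclusions can be sketched with Python's `fnmatch` (illustrative only; the SDK's matcher and its exact exclusion semantics may differ):

```python
from fnmatch import fnmatch

def matches(tool_name: str, patterns: list[str]) -> bool:
    """Keep a tool if it matches any include pattern and no '!'-prefixed exclude.
    (Sketch of the filtering idea, not the SDK's implementation.)"""
    includes = [p for p in patterns if not p.startswith("!")]
    excludes = [p[1:] for p in patterns if p.startswith("!")]
    included = any(fnmatch(tool_name, p) for p in includes)
    excluded = any(fnmatch(tool_name, p) for p in excludes)
    return included and not excluded

matches("bamboohr_list_employees", ["bamboohr_*", "!*_delete_*"])  # True
matches("bamboohr_delete_employee", ["bamboohr_*", "!*_delete_*"])  # False
```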

## Implicit Feedback (Beta)

The Python SDK can emit implicit behavioral feedback to LangSmith so you can triage low-quality tool results without manually tagging runs.

### Automatic configuration

Set `LANGSMITH_API_KEY` in your environment and the SDK will initialize the implicit feedback manager on first tool execution. You can optionally fine-tune behavior with:

- `STACKONE_IMPLICIT_FEEDBACK_ENABLED` (`true`/`false`, defaults to `true` when an API key is present)
- `STACKONE_IMPLICIT_FEEDBACK_PROJECT` to pin a LangSmith project name
- `STACKONE_IMPLICIT_FEEDBACK_TAGS` with a comma-separated list of tags applied to every run
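
As a sketch of how these flags combine (illustrative; the SDK's actual parsing may differ), the enabled flag defaults on only when a LangSmith key is present, and tags are split on commas:

```python
import os

def implicit_feedback_config() -> dict:
    """Illustrative reading of the environment flags above (not the SDK's code)."""
    api_key = os.environ.get("LANGSMITH_API_KEY")
    # Enabled defaults to "true" only when an API key is present
    enabled = os.environ.get(
        "STACKONE_IMPLICIT_FEEDBACK_ENABLED",
        "true" if api_key else "false",
    ).lower() == "true"
    # Comma-separated tags, whitespace-trimmed, empty entries dropped
    tags = [
        t.strip()
        for t in os.environ.get("STACKONE_IMPLICIT_FEEDBACK_TAGS", "").split(",")
        if t.strip()
    ]
    return {
        "enabled": enabled,
        "project": os.environ.get("STACKONE_IMPLICIT_FEEDBACK_PROJECT"),
        "tags": tags,
    }
```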

### Manual configuration

If you want custom session or user resolvers, call `configure_implicit_feedback` during start-up:

```python
from stackone_ai import configure_implicit_feedback

configure_implicit_feedback(
    api_key="/path/to/langsmith.key",
    project_name="stackone-agents",
    default_tags=["python-sdk"],
)
```

Providing your own `session_resolver`/`user_resolver` callbacks lets you derive identifiers from the request context before events are sent to LangSmith.

### Attaching session context to tool calls

Both `tool.execute` and `tool.call` accept an `options` keyword that is excluded from the API request but forwarded to the feedback manager:

```python
tool.execute(
    {"id": "employee-id"},
    options={
        "feedback_session_id": "chat-42",
        "feedback_user_id": "user-123",
        "feedback_metadata": {"conversation_id": "abc"},
    },
)
```

When two calls for the same session happen within a few seconds, the SDK emits a `refinement_needed` event, and you can inspect suitability scores directly in LangSmith.

## Integration Examples

<details>
<summary>LangChain Integration</summary>

StackOne tools work seamlessly with LangChain, enabling powerful AI agent workflows:

```python
from langchain_openai import ChatOpenAI
from stackone_ai import StackOneToolSet

# Initialize StackOne tools
toolset = StackOneToolSet()
tools = toolset.fetch_tools(actions=["bamboohr_*"], account_ids=["your-account-id"])

# Convert to LangChain format
langchain_tools = tools.to_langchain()

# Use with LangChain models
model = ChatOpenAI(model="gpt-4o-mini")
model_with_tools = model.bind_tools(langchain_tools)

# Execute AI-driven tool calls
response = model_with_tools.invoke("Get employee information for ID: emp123")

# Handle tool calls
for tool_call in response.tool_calls:
    tool = tools.get_tool(tool_call["name"])
    if tool:
        result = tool.execute(tool_call["args"])
        print(f"Result: {result}")
```

</details>

<details>
<summary>LangGraph Integration</summary>

StackOne tools convert to LangChain tools, which LangGraph consumes via its prebuilt nodes:

Prerequisites:

```bash
pip install langgraph langchain-openai
```

```python
from langchain_openai import ChatOpenAI
from typing import Annotated
from typing_extensions import TypedDict

from langgraph.graph import StateGraph, START, END
from langgraph.graph.message import add_messages
from langgraph.prebuilt import tools_condition

from stackone_ai import StackOneToolSet
from stackone_ai.integrations.langgraph import to_tool_node, bind_model_with_tools

# Prepare tools
toolset = StackOneToolSet()
tools = toolset.fetch_tools(actions=["bamboohr_*"], account_ids=["your-account-id"])
langchain_tools = tools.to_langchain()

class State(TypedDict):
    messages: Annotated[list, add_messages]

# Build a small agent loop: LLM -> maybe tools -> back to LLM
graph = StateGraph(State)
graph.add_node("tools", to_tool_node(langchain_tools))

def call_llm(state: State):
    llm = ChatOpenAI(model="gpt-4o-mini")
    llm = bind_model_with_tools(llm, langchain_tools)
    resp = llm.invoke(state["messages"])  # returns AIMessage with optional tool_calls
    # add_messages appends returned messages to state, so return only the new one
    return {"messages": [resp]}

graph.add_node("llm", call_llm)
graph.add_edge(START, "llm")
graph.add_conditional_edges("llm", tools_condition)
graph.add_edge("tools", "llm")
app = graph.compile()

_ = app.invoke({"messages": [("user", "Get employee with id emp123")]})
```

</details>

<details>
<summary>CrewAI Integration</summary>

CrewAI uses LangChain tools natively, making integration seamless:

```python
from crewai import Agent, Crew, Task
from stackone_ai import StackOneToolSet

# Get tools and convert to LangChain format
toolset = StackOneToolSet()
tools = toolset.fetch_tools(actions=["bamboohr_*"], account_ids=["your-account-id"])
langchain_tools = tools.to_langchain()

# Create CrewAI agent with StackOne tools
agent = Agent(
    role="HR Manager",
    goal="Analyze employee data and generate insights",
    backstory="Expert in HR analytics and employee management",
    tools=langchain_tools,
    llm="gpt-4o-mini"
)

# Define task and execute
task = Task(
    description="Find all employees in the engineering department",
    agent=agent,
    expected_output="List of engineering employees with their details"
)

crew = Crew(agents=[agent], tasks=[task])
result = crew.kickoff()
```

</details>

## Feedback Collection

The SDK includes a feedback collection tool (`tool_feedback`) that allows users to submit feedback about their experience with StackOne tools. This tool is automatically included in the toolset and is designed to be invoked by AI agents after user permission.

```python
from stackone_ai import StackOneToolSet

toolset = StackOneToolSet()

# Get the feedback tool (included with "tool_*" pattern or all tools)
tools = toolset.fetch_tools(actions=["tool_*"])
feedback_tool = tools.get_tool("tool_feedback")

# Submit feedback (typically invoked by AI after user consent)
result = feedback_tool.call(
    feedback="The HRIS tools are working great! Very fast response times.",
    account_id="acc_123456",
    tool_names=["bamboohr_list_employees", "bamboohr_get_employee"]
)
```

**Important**: The AI agent should always ask for user permission before submitting feedback:

- "Are you ok with sending feedback to StackOne? The LLM will take care of sending it."
- Only call the tool after the user explicitly agrees.

## Search Tool

Search for tools using natural language queries. Works with both semantic (cloud) and local BM25+TF-IDF search.

### Basic Usage

```python
from stackone_ai import StackOneToolSet

# Get a callable search tool
toolset = StackOneToolSet()
all_tools = toolset.fetch_tools(account_ids=["your-account-id"])
search_tool = toolset.get_search_tool()

# Search for relevant tools — returns a Tools collection
tools = search_tool("manage employees", top_k=5)

# Execute a discovered tool directly
tools[0](limit=10)
```

## Semantic Search

Discover tools using natural language instead of exact names. Queries like "onboard new hire" resolve to the right actions even when the tool is called `bamboohr_create_employee`.

```python
from stackone_ai import StackOneToolSet

toolset = StackOneToolSet()

# Search by intent — returns Tools collection ready for any framework
tools = toolset.search_tools("manage employee records", account_ids=["your-account-id"], top_k=5)
openai_tools = tools.to_openai()

# Lightweight: inspect results without fetching full tool definitions
results = toolset.search_action_names("time off requests", top_k=5)
```

### Search Modes

Control which search backend `search_tools()` uses via the `search` parameter:

```python
# "auto" (default) — tries semantic search first, falls back to local
tools = toolset.search_tools("manage employees", search="auto")

# "semantic" — semantic API only, raises if unavailable
tools = toolset.search_tools("manage employees", search="semantic")

# "local" — local BM25+TF-IDF only, no semantic API call
tools = toolset.search_tools("manage employees", search="local")
```
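
Local mode ranks tool descriptions lexically. A toy BM25 scorer conveys the idea (illustrative only; the SDK uses the `bm25s` package and its own index, which may rank differently):

```python
import math
from collections import Counter

def bm25_rank(query: str, docs: list[str], k1: float = 1.5, b: float = 0.75) -> list[int]:
    """Return doc indices sorted best-first by classic BM25 score (toy ranker)."""
    tokenized = [d.lower().split() for d in docs]
    n = len(docs)
    avgdl = sum(len(t) for t in tokenized) / n
    df = Counter()  # number of docs containing each term
    for toks in tokenized:
        df.update(set(toks))
    scores = []
    for toks in tokenized:
        tf = Counter(toks)
        score = 0.0
        for term in query.lower().split():
            if term not in tf:
                continue
            idf = math.log((n - df[term] + 0.5) / (df[term] + 0.5) + 1)
            norm = 1 - b + b * len(toks) / avgdl
            score += idf * tf[term] * (k1 + 1) / (tf[term] + k1 * norm)
        scores.append(score)
    return sorted(range(n), key=lambda i: scores[i], reverse=True)

# Hypothetical tool descriptions, for illustration
descriptions = [
    "bamboohr_list_employees: list employees in the HRIS",
    "xero_create_invoice: create an invoice",
    "bamboohr_get_employee: get a single employee record",
]
bm25_rank("list employees", descriptions)[0]  # index of the best match
```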

Results are automatically scoped to connectors in your linked accounts. See [Semantic Search Example](examples/semantic_search_example.py) for `SearchTool` (`get_search_tool`) integration, OpenAI, and LangChain patterns.

## Examples

For more examples, check out the [examples/](examples/) directory:

- [StackOne Account IDs](examples/stackone_account_ids.py)
- [File Uploads](examples/file_uploads.py)
- [OpenAI Integration](examples/openai_integration.py)
- [LangChain Integration](examples/langchain_integration.py)
- [CrewAI Integration](examples/crewai_integration.py)
- [Search Tool](examples/search_tool_example.py)
- [Semantic Search](examples/semantic_search_example.py)

## Development

### Using Nix (Recommended)

This project includes a Nix flake for reproducible development environments. All development tools are defined in [flake.nix](./flake.nix) and provided via Nix.

#### Installing Nix

```bash
# Install Nix with flakes enabled (if not already installed)
curl --proto '=https' --tlsv1.2 -sSf -L https://artifacts.nixos.org/experimental-installer | \
  sh -s -- install

# If flakes are not enabled, enable them with:
mkdir -p ~/.config/nix && echo "experimental-features = nix-command flakes" >> ~/.config/nix/nix.conf
```

#### Activating the Development Environment

```bash
# Automatic activation with direnv (recommended)
direnv allow

# Or manual activation
nix develop
```

The Nix development environment includes:

- Python with uv package manager
- Automatic dependency installation
- Git hooks (treefmt + ty) auto-configured
- Consistent environment across all platforms

## License

Apache 2.0 License
