Merged
2 changes: 2 additions & 0 deletions README.md
@@ -13,6 +13,7 @@ All packages are extensions to the [UiPath Python SDK](https://github.com/UiPath

Build agents using the [LlamaIndex SDK](https://www.llamaindex.ai/):

- [README](packages/uipath-llamaindex/README.md)
- [Docs](https://uipath.github.io/uipath-python/llamaindex/quick_start/)
- [Samples](packages/uipath-llamaindex/samples/)

@@ -23,6 +24,7 @@ Build agents using the [LlamaIndex SDK](https://www.llamaindex.ai/):

Build agents using the [OpenAI Agents SDK](https://github.com/openai/openai-agents-python):

- [README](packages/uipath-openai-agents/README.md)
- [Docs](https://uipath.github.io/uipath-python/openai-agents/quick_start/)
- [Samples](packages/uipath-openai-agents/samples/)

123 changes: 108 additions & 15 deletions packages/uipath-openai-agents/README.md
@@ -1,32 +1,125 @@
# UiPath OpenAI Agents SDK
# UiPath OpenAI Agents Python SDK

Build intelligent AI agents with OpenAI's Agents framework and UiPath.
[![PyPI - Version](https://img.shields.io/pypi/v/uipath-openai-agents)](https://pypi.org/project/uipath-openai-agents/)
[![PyPI downloads](https://img.shields.io/pypi/dm/uipath-openai-agents.svg)](https://pypi.org/project/uipath-openai-agents/)
[![Python versions](https://img.shields.io/pypi/pyversions/uipath-openai-agents.svg)](https://pypi.org/project/uipath-openai-agents/)

A Python SDK that enables developers to build and deploy OpenAI Agents to the UiPath Cloud Platform. It provides programmatic interaction with UiPath Cloud Platform services.

This package is an extension to the [UiPath Python SDK](https://github.com/UiPath/uipath-python) and implements the [UiPath Runtime Protocol](https://github.com/UiPath/uipath-runtime-python).

Check out these [sample projects](https://github.com/UiPath/uipath-integrations-python/tree/main/packages/uipath-openai-agents/samples) to see the SDK in action.

## Requirements

- Python 3.11 or higher
- UiPath Automation Cloud account

## Installation

```bash
pip install uipath-openai-agents
```

## Quick Start
using `uv`:

We should make clear these are alternatives to the same thing (see LangChain/LlamaIndex getting started examples)


```bash
uv add uipath-openai-agents
```

missing: `uv init . --python 3.11` before `uv add uipath-openai-agents`

I get this error for the uv command: `error: No pyproject.toml found in current directory or any parent directory`

even after using `uv init . --python 3.11` and then `uv add uipath-openai-agents`, I got this error message:

`Because only uipath-openai-agents==0.0.1 is available and uipath-openai-agents==0.0.1 depends on your project, we can conclude that
all versions of uipath-openai-agents depend on your project.
And because your project depends on uipath-openai-agents, we can conclude that your project's requirements are unsatisfiable.

  hint: The package `uipath-openai-agents` depends on the package `openai` but the name is shadowed by your project. Consider changing
  the name of the project.`

This is a limitation if the project name in `pyproject.toml` is `openai`
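Putting the review notes together, a sequence that avoids both problems might look like this (run in an empty project directory that is not named `openai`, so the generated project name does not shadow the `openai` dependency):

```bash
# Create a project first; uv uses the directory name as the project name,
# so avoid running this in a directory called "openai".
uv init . --python 3.11

# Then add the package.
uv add uipath-openai-agents
```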

## Configuration

### Environment Variables

Create a `.env` file in your project root with the following variables:

```
UIPATH_URL=https://cloud.uipath.com/ACCOUNT_NAME/TENANT_NAME
UIPATH_ACCESS_TOKEN=YOUR_TOKEN_HERE
```

## Command Line Interface (CLI)

The SDK provides a command-line interface for creating, packaging, and deploying OpenAI Agents:

### Authentication

```bash
uipath auth
```

No `uipath` command will work without venv activation.

This command opens a browser for authentication and creates/updates your `.env` file with the proper credentials.
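As the review note above points out, the `uipath` CLI is installed inside the project's virtual environment, so either activate it first or go through `uv run` (assuming a `uv`-managed `.venv`; the activation path differs on Windows):

```bash
# Option 1: activate the environment created by uv, then authenticate.
source .venv/bin/activate
uipath auth

# Option 2: run the CLI through uv without activating the environment.
uv run uipath auth
```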

### Initialize a Project

```bash
uipath init
```

Running `uipath init` will process the agent definitions in the `openai_agents.json` file and create the corresponding `entry-points.json` file needed for deployment.

For more details on the configuration format, see the [UiPath configuration specifications](https://github.com/UiPath/uipath-python/blob/main/specs/README.md).
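A minimal `openai_agents.json` sketch, assuming the agent-definition format shown in the project-structure section below (a single agent exported as `agent` from `main.py`):

```json
{
  "agents": {
    "agent": "main.py:agent"
  }
}
```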

### Debug a Project

```bash
uipath run AGENT [INPUT]
```

Executes the agent with the provided JSON input arguments.
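For example, using the input shape from the quick-start docs (the exact fields depend on your agent's input schema):

```bash
uipath run agent '{"messages": "Hello"}'
```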

### Package a Project

```bash
uipath pack
```

Packages your project into a `.nupkg` file that can be deployed to UiPath.

**Note:** Your `pyproject.toml` must include:

- A description field (avoid characters: &, <, >, ", ', ;)
- Author information

Example:

```toml
description = "Your package description"
authors = [{name = "Your Name", email = "your.email@example.com"}]
```

### Publish a Package

```bash
uipath publish
```

Publishes the most recently created package to your UiPath Orchestrator.

## Project Structure

To properly use the CLI for packaging and publishing, your project should include:

See the [main repository documentation](../../docs/) for getting started guides and examples.
- A `pyproject.toml` file with project metadata
- An `openai_agents.json` file with your agent definitions (e.g., `"agents": {"agent": "main.py:agent"}`)
- An `entry-points.json` file (generated by `uipath init`)
- A `bindings.json` file (generated by `uipath init`) to configure resource overrides
- Any Python files needed for your automation (a minimal `main.py` sketch follows)
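A minimal `main.py` sketch along these lines, loosely adapted from the agents-as-tools sample further down in this diff. The `set_default_openai_client` import, the instructions, and the agent name are illustrative assumptions rather than the sample verbatim:

```python
from agents import Agent, set_default_openai_client

from uipath_openai_agents.chat import UiPathChatOpenAI
from uipath_openai_agents.chat.supported_models import OpenAIModels


def main() -> Agent:
    """Configure the UiPath OpenAI client and return the agent."""
    # Route all OpenAI API calls through UiPath's LLM Gateway.
    uipath_client = UiPathChatOpenAI(model_name=OpenAIModels.gpt_5_1_2025_11_13)
    set_default_openai_client(uipath_client.async_client)

    # The object that openai_agents.json points at ("main.py:agent").
    return Agent(
        name="agent",
        instructions="You are a helpful assistant.",  # illustrative placeholder
    )


agent = main()
```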

## Features
## Development

- **OpenAI Agents Integration**: Build agents using OpenAI's native Agents framework
- **Agent Orchestration**: Multi-agent coordination and communication
- **State Management**: Persistent agent state with SQLite sessions
- **UiPath Integration**: Seamless integration with UiPath runtime and tooling
### Developer Tools

## Status
Check out [uipath-dev](https://github.com/uipath/uipath-dev-python) - an interactive terminal application for building, testing, and debugging UiPath Python runtimes, agents, and automation scripts.

⚠️ **Early Development**: This package is in early development (v0.1.0). APIs may change as the OpenAI Agents framework evolves.
### Setting Up a Development Environment

## Documentation
Please read our [contribution guidelines](https://github.com/UiPath/uipath-integrations-python/blob/main/packages/uipath-openai-agents/CONTRIBUTING.md) before submitting a pull request.

Full documentation is available in the [main repository](https://github.com/UiPath/uipath-llamaindex-python).
### Special Thanks

## License
A huge thank-you to the open-source community and the maintainers of the libraries that make this project possible:

See [LICENSE](../../LICENSE) in the repository root.
- [OpenAI](https://github.com/openai/openai-python) for providing a powerful framework for building AI agents.
- [OpenInference](https://github.com/Arize-ai/openinference) for observability and instrumentation support.
- [Pydantic](https://github.com/pydantic/pydantic) for reliable, typed configuration and validation.
14 changes: 7 additions & 7 deletions packages/uipath-openai-agents/docs/quick_start.md
@@ -106,7 +106,7 @@ Generate your first UiPath OpenAI agent:
✓ Created 'pyproject.toml' file.
🔧 Please ensure to define OPENAI_API_KEY in your .env file.
💡 Initialize project: uipath init
💡 Run agent: uipath run agent '{"message": "Hello"}'
💡 Run agent: uipath run agent '{"messages": "Hello"}'
```

This command creates the following files:
@@ -173,7 +173,7 @@ Execute the agent with a sample input:
<!-- termynal -->

```shell
> uipath run agent '{"message": "Hello"}'
> uipath run agent '{"messages": "Hello"}'
{'response': 'Hello! How can I help you today?', 'agent_used': 'main'}
✓ Successful execution.
```
@@ -185,19 +185,19 @@ Depending on the shell you are using, it may be necessary to escape the input js

/// tab | Bash/ZSH/PowerShell
```console
uipath run agent '{"message": "Hello"}'
uipath run agent '{"messages": "Hello"}'
```
///

/// tab | Windows CMD
```console
uipath run agent "{""message"": ""Hello""}"
uipath run agent "{""messages"": ""Hello""}"
```
///

/// tab | Windows PowerShell
```console
uipath run agent '{\"message\":\"Hello\"}'
uipath run agent '{\"messages\":\"Hello\"}'
```
///

@@ -215,7 +215,7 @@ The `run` command can also take a .json file as an input. You can create a file

```json
{
"message": "Hello"
"messages": "Hello"
}
```

@@ -275,7 +275,7 @@ Set the environment variables using the provided link.
<!-- termynal -->

```shell
> uipath invoke agent '{"message": "Hello"}'
> uipath invoke agent '{"messages": "Hello"}'
⠴ Loading configuration ...
⠴ Starting job ...
✨ Job started successfully!
3 changes: 2 additions & 1 deletion packages/uipath-openai-agents/pyproject.toml
@@ -70,7 +70,8 @@ plugins = [
"pydantic.mypy"
]
exclude = [
"samples/.*"
"samples/.*",
"testcases/.*"
]
follow_imports = "silent"
warn_redundant_casts = true
@@ -1,3 +1,3 @@
{
"message": "Tell me a joke"
"messages": "Tell me a joke"
}
3 changes: 2 additions & 1 deletion packages/uipath-openai-agents/samples/agent-as-tools/main.py
@@ -3,6 +3,7 @@
from pydantic import BaseModel, Field

from uipath_openai_agents.chat import UiPathChatOpenAI
from uipath_openai_agents.chat.supported_models import OpenAIModels

"""
This example shows the agents-as-tools pattern adapted for UiPath coded agents.
@@ -32,7 +33,7 @@ def main() -> Agent:
"""Configure UiPath OpenAI client and return the orchestrator agent."""
# Configure UiPath OpenAI client for agent execution
# This routes all OpenAI API calls through UiPath's LLM Gateway
MODEL = "gpt-4o-2024-11-20"
MODEL = OpenAIModels.gpt_5_1_2025_11_13
uipath_openai_client = UiPathChatOpenAI(model_name=MODEL)
_openai_shared.set_default_openai_client(uipath_openai_client.async_client)

@@ -59,8 +59,10 @@ uv run uipath init --infer-bindings
| `--input-file` | value | `Sentinel.UNSET` | Alias for '-f/--file' arguments |
| `--output-file` | value | `Sentinel.UNSET` | File path where the output will be written |
| `--trace-file` | value | `Sentinel.UNSET` | File path where the trace spans will be written (JSON Lines format) |
| `--state-file` | value | `Sentinel.UNSET` | File path where the state file is stored for persisting execution state. If not provided, a temporary file will be used. |
| `--debug` | flag | false | Enable debugging with debugpy. The process will wait for a debugger to attach. |
| `--debug-port` | value | `5678` | Port for the debug server (default: 5678) |
| `--keep-state-file` | flag | false | Keep the temporary state file even when not resuming and no job id is provided |

**Usage Examples:**

@@ -102,6 +104,7 @@ uv run uipath run --resume
trace_file: File path where traces will be written in JSONL format
max_llm_concurrency: Maximum concurrent LLM requests
input_overrides: Input field overrides mapping (direct field override with deep merge)
resume: Resume execution from a previous suspended state


**Arguments:**
@@ -124,6 +127,7 @@ uv run uipath run --resume
| `--model-settings-id` | value | `"default"` | Model settings ID from evaluation set to override agent settings (default: 'default') |
| `--trace-file` | value | `Sentinel.UNSET` | File path where traces will be written in JSONL format |
| `--max-llm-concurrency` | value | `20` | Maximum concurrent LLM requests (default: 20) |
| `--resume` | flag | false | Resume execution from a previous suspended state |

**Usage Examples:**

@@ -16,6 +16,25 @@ sdk = UiPath()
sdk = UiPath(base_url="https://cloud.uipath.com/...", secret="your_token")
```

### Agenthub

Agenthub service

```python
# Fetch available models from LLM Gateway discovery endpoint.
sdk.agenthub.get_available_llm_models(headers: dict[str, Any] | None=None) -> list[uipath.platform.agenthub.agenthub.LlmModel]

# Asynchronously fetch available models from LLM Gateway discovery endpoint.
sdk.agenthub.get_available_llm_models_async(headers: dict[str, Any] | None=None) -> list[uipath.platform.agenthub.agenthub.LlmModel]

# Start a system agent job.
sdk.agenthub.invoke_system_agent(agent_name: str, entrypoint: str, input_arguments: dict[str, Any] | None=None, folder_key: str | None=None, folder_path: str | None=None, headers: dict[str, Any] | None=None) -> str

# Asynchronously start a system agent and return the job.
sdk.agenthub.invoke_system_agent_async(agent_name: str, entrypoint: str, input_arguments: dict[str, Any] | None=None, folder_key: str | None=None, folder_path: str | None=None, headers: dict[str, Any] | None=None) -> str

```
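A short usage sketch for the new Agenthub methods, assuming an already-configured `UiPath()` client as above; the agent name, entrypoint, and folder path are illustrative placeholders:

```python
from uipath import UiPath

sdk = UiPath()

# Discover which models the LLM Gateway currently exposes.
models = sdk.agenthub.get_available_llm_models()
print(models)

# Start a system agent job; per the signature above, a job identifier string is returned.
job_id = sdk.agenthub.invoke_system_agent(
    agent_name="my-system-agent",           # illustrative placeholder
    entrypoint="main.py",                   # illustrative placeholder
    input_arguments={"messages": "Hello"},
    folder_path="Shared",                   # illustrative placeholder
)
print(job_id)
```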

### Api Client

Api Client service
Expand All @@ -31,6 +50,12 @@ service = sdk.api_client
Assets service

```python
# List assets using OData API with offset-based pagination.
sdk.assets.list(folder_path: Optional[str]=None, folder_key: Optional[str]=None, filter: Optional[str]=None, orderby: Optional[str]=None, skip: int=0, top: int=100) -> uipath.platform.common.paging.PagedResult[uipath.platform.orchestrator.assets.Asset]

# Asynchronously list assets using OData API with offset-based pagination.
sdk.assets.list_async(folder_path: Optional[str]=None, folder_key: Optional[str]=None, filter: Optional[str]=None, orderby: Optional[str]=None, skip: int=0, top: int=100) -> uipath.platform.common.paging.PagedResult[uipath.platform.orchestrator.assets.Asset]

# Retrieve an asset by its name.
sdk.assets.retrieve(name: str, folder_key: Optional[str]=None, folder_path: Optional[str]=None) -> uipath.platform.orchestrator.assets.UserAsset | uipath.platform.orchestrator.assets.Asset

@@ -505,7 +530,7 @@ Llm service

```python
# Generate chat completions using UiPath's normalized LLM Gateway API.
sdk.llm.chat_completions(messages: list[dict[str, str]] | list[tuple[str, str]], model: str="gpt-4o-mini-2024-07-18", max_tokens: int=4096, temperature: float=0, n: int=1, frequency_penalty: float=0, presence_penalty: float=0, top_p: float | None=1, top_k: int | None=None, tools: list[uipath.platform.chat.llm_gateway.ToolDefinition] | None=None, tool_choice: Union[uipath.platform.chat.llm_gateway.AutoToolChoice, uipath.platform.chat.llm_gateway.RequiredToolChoice, uipath.platform.chat.llm_gateway.SpecificToolChoice, Literal['auto', 'none'], NoneType]=None, response_format: dict[str, Any] | type[pydantic.main.BaseModel] | None=None, api_version: str="2024-08-01-preview")
sdk.llm.chat_completions(messages: list[dict[str, str]] | list[tuple[str, str]], model: str="gpt-4.1-mini-2025-04-14", max_tokens: int=4096, temperature: float=0, n: int=1, frequency_penalty: float=0, presence_penalty: float=0, top_p: float | None=1, top_k: int | None=None, tools: list[uipath.platform.chat.llm_gateway.ToolDefinition] | None=None, tool_choice: Union[uipath.platform.chat.llm_gateway.AutoToolChoice, uipath.platform.chat.llm_gateway.RequiredToolChoice, uipath.platform.chat.llm_gateway.SpecificToolChoice, Literal['auto', 'none'], NoneType]=None, response_format: dict[str, Any] | type[pydantic.main.BaseModel] | None=None, api_version: str="2024-08-01-preview")

```
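A hedged usage sketch of the normalized chat-completions call above, assuming the usual role/content message shape; unspecified parameters fall back to the signature's defaults, including the new `gpt-4.1-mini` default model:

```python
from uipath import UiPath

sdk = UiPath()

# Minimal normalized chat-completions call through UiPath's LLM Gateway.
response = sdk.llm.chat_completions(
    messages=[{"role": "user", "content": "Summarize what UiPath Assets are."}],
    max_tokens=256,
)
print(response)
```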

@@ -515,7 +540,7 @@ Llm Openai service

```python
# Generate chat completions using UiPath's LLM Gateway service.
sdk.llm_openai.chat_completions(messages: list[dict[str, str]], model: str="gpt-4o-mini-2024-07-18", max_tokens: int=4096, temperature: float=0, response_format: dict[str, Any] | type[pydantic.main.BaseModel] | None=None, api_version: str="2024-10-21")
sdk.llm_openai.chat_completions(messages: list[dict[str, str]], model: str="gpt-4.1-mini-2025-04-14", max_tokens: int=4096, temperature: float=0, response_format: dict[str, Any] | type[pydantic.main.BaseModel] | None=None, api_version: str="2024-10-21")

# Generate text embeddings using UiPath's LLM Gateway service.
sdk.llm_openai.embeddings(input: str, embedding_model: str="text-embedding-ada-002", openai_api_version: str="2024-10-21")