Commit 90c4aad

Merge pull request #145 from UiPath/fix/input_parameters

fix: parse context input

2 parents: 65901a1 + 2192196

17 files changed: 1,028 additions and 119 deletions

packages/uipath-openai-agents/pyproject.toml (1 addition, 1 deletion)

```diff
@@ -1,6 +1,6 @@
 [project]
 name = "uipath-openai-agents"
-version = "0.0.3"
+version = "0.0.4"
 description = "UiPath OpenAI Agents SDK"
 readme = "README.md"
 requires-python = ">=3.11"
```

packages/uipath-openai-agents/samples/agent-as-tools/.agent/CLI_REFERENCE.md (55 additions, 0 deletions)

```diff
@@ -59,8 +59,10 @@ uv run uipath init --infer-bindings
 | `--input-file` | value | `Sentinel.UNSET` | Alias for '-f/--file' arguments |
 | `--output-file` | value | `Sentinel.UNSET` | File path where the output will be written |
 | `--trace-file` | value | `Sentinel.UNSET` | File path where the trace spans will be written (JSON Lines format) |
+| `--state-file` | value | `Sentinel.UNSET` | File path where the state file is stored for persisting execution state. If not provided, a temporary file will be used. |
 | `--debug` | flag | false | Enable debugging with debugpy. The process will wait for a debugger to attach. |
 | `--debug-port` | value | `5678` | Port for the debug server (default: 5678) |
+| `--keep-state-file` | flag | false | Keep the temporary state file even when not resuming and no job id is provided |
 
 **Usage Examples:**
 
@@ -99,6 +101,10 @@ uv run uipath run --resume
     enable_mocker_cache: Enable caching for LLM mocker responses
     report_coverage: Report evaluation coverage
     model_settings_id: Model settings ID to override agent settings
+    trace_file: File path where traces will be written in JSONL format
+    max_llm_concurrency: Maximum concurrent LLM requests
+    input_overrides: Input field overrides mapping (direct field override with deep merge)
+    resume: Resume execution from a previous suspended state
 
 
 **Arguments:**
@@ -120,6 +126,8 @@ uv run uipath run --resume
 | `--report-coverage` | flag | false | Report evaluation coverage |
 | `--model-settings-id` | value | `"default"` | Model settings ID from evaluation set to override agent settings (default: 'default') |
 | `--trace-file` | value | `Sentinel.UNSET` | File path where traces will be written in JSONL format |
+| `--max-llm-concurrency` | value | `20` | Maximum concurrent LLM requests (default: 20) |
+| `--resume` | flag | false | Resume execution from a previous suspended state |
 
 **Usage Examples:**
```
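The `input_overrides` mapping documented above is described as a "direct field override with deep merge". A minimal Python sketch of what deep-merge semantics typically mean here (a hypothetical helper, not the CLI's actual implementation):

```python
from typing import Any


def deep_merge(base: dict[str, Any], overrides: dict[str, Any]) -> dict[str, Any]:
    """Return a copy of `base` with `overrides` merged in recursively.

    Nested dicts are merged key by key; any other value in `overrides`
    replaces the corresponding value in `base` outright.
    """
    merged = dict(base)
    for key, value in overrides.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = deep_merge(merged[key], value)
        else:
            merged[key] = value
    return merged


job_input = {"customer": {"name": "ACME", "tier": "gold"}, "retries": 3}
overrides = {"customer": {"tier": "platinum"}, "dry_run": True}
print(deep_merge(job_input, overrides))
# → {'customer': {'name': 'ACME', 'tier': 'platinum'}, 'retries': 3, 'dry_run': True}
```

The point of a deep merge over a plain `dict.update` is that overriding `customer.tier` leaves sibling fields like `customer.name` intact.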

```diff
@@ -226,6 +234,53 @@ The `uipath.json` file is automatically generated by `uipath init` and defines y
 
 The UiPath CLI provides commands for interacting with UiPath platform services. These commands allow you to manage buckets, assets, jobs, and other resources.
 
+### `uipath assets`
+
+Manage UiPath assets.
+
+Assets are key-value pairs that store configuration data, credentials,
+and settings used by automation processes.
+
+\b
+Examples:
+    # List all assets in a folder
+    uipath assets list --folder-path "Shared"
+
+    # List with filter
+    uipath assets list --filter "ValueType eq 'Text'"
+
+    # List with ordering
+    uipath assets list --orderby "Name asc"
+
+**Subcommands:**
+
+**`uipath assets list`**
+
+List assets in a folder.
+
+\b
+Examples:
+    uipath assets list
+    uipath assets list --folder-path "Shared"
+    uipath assets list --filter "ValueType eq 'Text'"
+    uipath assets list --filter "Name eq 'MyAsset'"
+    uipath assets list --orderby "Name asc"
+    uipath assets list --top 50 --skip 100
+
+Options:
+- `--filter`: OData $filter expression (default: `Sentinel.UNSET`)
+- `--orderby`: OData $orderby expression (default: `Sentinel.UNSET`)
+- `--top`: Maximum number of items to return (default: 100, max: 1000) (default: `100`)
+- `--skip`: Number of items to skip (default: `0`)
+- `--folder-path`: Folder path (e.g., "Shared"). Can also be set via the UIPATH_FOLDER_PATH environment variable. (default: `Sentinel.UNSET`)
+- `--folder-key`: Folder key (UUID) (default: `Sentinel.UNSET`)
+- `--format`: Output format (overrides global) (default: `Sentinel.UNSET`)
+- `--output`, `-o`: Output file (overrides global) (default: `Sentinel.UNSET`)
+
+---
+
 ### `uipath buckets`
 
 Manage UiPath storage buckets and files.
```

packages/uipath-openai-agents/samples/agent-as-tools/.agent/SDK_REFERENCE.md (39 additions, 2 deletions)

````diff
@@ -16,6 +16,25 @@ sdk = UiPath()
 sdk = UiPath(base_url="https://cloud.uipath.com/...", secret="your_token")
 ```
 
+### Agenthub
+
+Agenthub service
+
+```python
+# Fetch available models from LLM Gateway discovery endpoint.
+sdk.agenthub.get_available_llm_models(headers: dict[str, Any] | None=None) -> list[uipath.platform.agenthub.agenthub.LlmModel]
+
+# Asynchronously fetch available models from LLM Gateway discovery endpoint.
+sdk.agenthub.get_available_llm_models_async(headers: dict[str, Any] | None=None) -> list[uipath.platform.agenthub.agenthub.LlmModel]
+
+# Start a system agent job.
+sdk.agenthub.invoke_system_agent(agent_name: str, entrypoint: str, input_arguments: dict[str, Any] | None=None, folder_key: str | None=None, folder_path: str | None=None, headers: dict[str, Any] | None=None) -> str
+
+# Asynchronously start a system agent and return the job.
+sdk.agenthub.invoke_system_agent_async(agent_name: str, entrypoint: str, input_arguments: dict[str, Any] | None=None, folder_key: str | None=None, folder_path: str | None=None, headers: dict[str, Any] | None=None) -> str
+
+```
 
 ### Api Client
 
 Api Client service
@@ -31,6 +50,12 @@ service = sdk.api_client
 Assets service
 
 ```python
+# List assets using OData API with offset-based pagination.
+sdk.assets.list(folder_path: Optional[str]=None, folder_key: Optional[str]=None, filter: Optional[str]=None, orderby: Optional[str]=None, skip: int=0, top: int=100) -> uipath.platform.common.paging.PagedResult[uipath.platform.orchestrator.assets.Asset]
+
+# Asynchronously list assets using OData API with offset-based pagination.
+sdk.assets.list_async(folder_path: Optional[str]=None, folder_key: Optional[str]=None, filter: Optional[str]=None, orderby: Optional[str]=None, skip: int=0, top: int=100) -> uipath.platform.common.paging.PagedResult[uipath.platform.orchestrator.assets.Asset]
+
 # Retrieve an asset by its name.
 sdk.assets.retrieve(name: str, folder_key: Optional[str]=None, folder_path: Optional[str]=None) -> uipath.platform.orchestrator.assets.UserAsset | uipath.platform.orchestrator.assets.Asset
````
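The new `sdk.assets.list` above pages through results with `skip`/`top` offsets. A minimal sketch of draining such an offset-paginated listing, with a stubbed page fetcher standing in for the real SDK call (the actual return type is a `PagedResult`; the helper and stub names here are hypothetical):

```python
from typing import Callable


def list_all(fetch_page: Callable[[int, int], list[str]], top: int = 100) -> list[str]:
    """Drain an offset-paginated listing by advancing `skip` until a short page."""
    items: list[str] = []
    skip = 0
    while True:
        page = fetch_page(skip, top)
        items.extend(page)
        if len(page) < top:  # a short (or empty) page means we've reached the end
            return items
        skip += top


# Stub standing in for something like: sdk.assets.list(skip=..., top=...)
DATA = [f"asset-{i}" for i in range(250)]


def fake_fetch(skip: int, top: int) -> list[str]:
    return DATA[skip : skip + top]


print(len(list_all(fake_fetch, top=100)))
# → 250
```

Stopping on the first short page avoids a final empty request when the total is not a multiple of `top`.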

````diff
@@ -340,12 +365,24 @@ sdk.documents.retrieve_ixp_extraction_result(project_id: str, tag: str, operatio
 # Asynchronous version of the [`retrieve_ixp_extraction_result`][uipath.platform.documents._documents_service.DocumentsService.retrieve_ixp_extraction_result] method.
 sdk.documents.retrieve_ixp_extraction_result_async(project_id: str, tag: str, operation_id: str) -> uipath.platform.documents.documents.ExtractionResponseIXP
 
+# Retrieve the result of an IXP create validate extraction action operation (single-shot, non-blocking).
+sdk.documents.retrieve_ixp_extraction_validation_result(project_id: str, tag: str, operation_id: str) -> uipath.platform.documents.documents.ValidateExtractionAction
+
+# Asynchronous version of the [`retrieve_ixp_extraction_validation_result`][uipath.platform.documents._documents_service.DocumentsService.retrieve_ixp_extraction_validation_result] method.
+sdk.documents.retrieve_ixp_extraction_validation_result_async(project_id: str, tag: str, operation_id: str) -> uipath.platform.documents.documents.ValidateExtractionAction
+
 # Start an IXP extraction process without waiting for results (non-blocking).
 sdk.documents.start_ixp_extraction(project_name: str, tag: str, file: Union[IO[bytes], bytes, str, NoneType]=None, file_path: Optional[str]=None) -> uipath.platform.documents.documents.StartExtractionResponse
 
 # Asynchronous version of the [`start_ixp_extraction`][uipath.platform.documents._documents_service.DocumentsService.start_ixp_extraction] method.
 sdk.documents.start_ixp_extraction_async(project_name: str, tag: str, file: Union[IO[bytes], bytes, str, NoneType]=None, file_path: Optional[str]=None) -> uipath.platform.documents.documents.StartExtractionResponse
 
+# Start an IXP extraction validation action without waiting for results (non-blocking).
+sdk.documents.start_ixp_extraction_validation(action_title: str, action_priority: ActionPriority, action_catalog: str, action_folder: str, storage_bucket_name: str, storage_bucket_directory_path: str, extraction_response: uipath.platform.documents.documents.ExtractionResponseIXP) -> uipath.platform.documents.documents.StartOperationResponse
+
+# Asynchronous version of the [`start_ixp_extraction_validation`][uipath.platform.documents._documents_service.DocumentsService.start_ixp_extraction_validation] method.
+sdk.documents.start_ixp_extraction_validation_async(action_title: str, action_priority: ActionPriority, action_catalog: str, action_folder: str, storage_bucket_name: str, storage_bucket_directory_path: str, extraction_response: uipath.platform.documents.documents.ExtractionResponseIXP) -> uipath.platform.documents.documents.StartOperationResponse
+
 ```
 
 ### Entities
@@ -505,7 +542,7 @@ Llm service
 
 ```python
 # Generate chat completions using UiPath's normalized LLM Gateway API.
-sdk.llm.chat_completions(messages: list[dict[str, str]] | list[tuple[str, str]], model: str="gpt-4o-mini-2024-07-18", max_tokens: int=4096, temperature: float=0, n: int=1, frequency_penalty: float=0, presence_penalty: float=0, top_p: float | None=1, top_k: int | None=None, tools: list[uipath.platform.chat.llm_gateway.ToolDefinition] | None=None, tool_choice: Union[uipath.platform.chat.llm_gateway.AutoToolChoice, uipath.platform.chat.llm_gateway.RequiredToolChoice, uipath.platform.chat.llm_gateway.SpecificToolChoice, Literal['auto', 'none'], NoneType]=None, response_format: dict[str, Any] | type[pydantic.main.BaseModel] | None=None, api_version: str="2024-08-01-preview")
+sdk.llm.chat_completions(messages: list[dict[str, str]] | list[tuple[str, str]], model: str="gpt-4.1-mini-2025-04-14", max_tokens: int=4096, temperature: float=0, n: int=1, frequency_penalty: float=0, presence_penalty: float=0, top_p: float | None=1, top_k: int | None=None, tools: list[uipath.platform.chat.llm_gateway.ToolDefinition] | None=None, tool_choice: Union[uipath.platform.chat.llm_gateway.AutoToolChoice, uipath.platform.chat.llm_gateway.RequiredToolChoice, uipath.platform.chat.llm_gateway.SpecificToolChoice, Literal['auto', 'none'], NoneType]=None, response_format: dict[str, Any] | type[pydantic.main.BaseModel] | None=None, api_version: str="2024-08-01-preview")
 
 ```
 
@@ -515,7 +552,7 @@ Llm Openai service
 
 ```python
 # Generate chat completions using UiPath's LLM Gateway service.
-sdk.llm_openai.chat_completions(messages: list[dict[str, str]], model: str="gpt-4o-mini-2024-07-18", max_tokens: int=4096, temperature: float=0, response_format: dict[str, Any] | type[pydantic.main.BaseModel] | None=None, api_version: str="2024-10-21")
+sdk.llm_openai.chat_completions(messages: list[dict[str, str]], model: str="gpt-4.1-mini-2025-04-14", max_tokens: int=4096, temperature: float=0, response_format: dict[str, Any] | type[pydantic.main.BaseModel] | None=None, api_version: str="2024-10-21")
 
 # Generate text embeddings using UiPath's LLM Gateway service.
 sdk.llm_openai.embeddings(input: str, embedding_model: str="text-embedding-ada-002", openai_api_version: str="2024-10-21")
````
