roboto.domain.datasets.operations#
Module Contents#
- class roboto.domain.datasets.operations.BeginManifestTransactionRequest(/, **data)#
Bases:
pydantic.BaseModel
Request payload to begin a manifest-based transaction.
- Parameters:
data (Any)
- device_id: str | None = None#
The ID of the device which created this dataset, if applicable.
- origination: str#
Additional information about the client or service that performed the upload, e.g. roboto client v1.0.0.
- resource_manifest: dict[str, int]#
Dictionary mapping destination file paths to file sizes in bytes.
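A minimal sketch of constructing this request; the destination paths, byte sizes, device ID, and origination string below are illustrative values.

```python
from roboto.domain.datasets.operations import BeginManifestTransactionRequest

# Map each destination path in the dataset to its size in bytes.
request = BeginManifestTransactionRequest(
    origination="roboto client v1.0.0",
    resource_manifest={
        "logs/run_001.mcap": 1_048_576,
        "logs/run_001.json": 2_048,
    },
    device_id="device-1234",  # optional; omit if no device is associated
)
```

The platform replies with a BeginManifestTransactionResponse (below), which carries the transaction_id and the upload_mappings for the declared resources.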
- class roboto.domain.datasets.operations.BeginManifestTransactionResponse(/, **data)#
Bases:
pydantic.BaseModel
Response to a manifest-based transaction request.
- Parameters:
data (Any)
- transaction_id: str#
- upload_mappings: dict[str, str]#
- class roboto.domain.datasets.operations.BeginSingleFileUploadRequest(/, **data)#
Bases:
pydantic.BaseModel
Request payload to begin a single file upload.
- Parameters:
data (Any)
- file_path: str = None#
- file_size: int = None#
- origination: str | None = None#
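A minimal sketch of declaring a single file for upload; the path, size, and origination string are illustrative.

```python
from roboto.domain.datasets.operations import BeginSingleFileUploadRequest

# Declare one file to upload to the dataset.
request = BeginSingleFileUploadRequest(
    file_path="logs/run_001.mcap",
    file_size=1_048_576,  # size in bytes
    origination="roboto client v1.0.0",  # optional
)
```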
- class roboto.domain.datasets.operations.BeginSingleFileUploadResponse(/, **data)#
Bases:
pydantic.BaseModel
Response to a single file upload request.
- Parameters:
data (Any)
- upload_id: str#
- upload_url: str#
- class roboto.domain.datasets.operations.BeginTransactionRequest(/, **data)#
Bases:
pydantic.BaseModel
Request payload for beginning a file upload transaction.
Used to initiate a transaction for uploading multiple files to a dataset. Transactions help coordinate batch uploads and provide progress tracking.
- Parameters:
data (Any)
- expected_resource_count: int | None = None#
Optional expected number of resources to be uploaded in this transaction.
- origination: str#
Description of the upload source (e.g., ‘roboto client v1.0.0’).
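A minimal sketch of starting a batch-upload transaction; the resource count and origination string are illustrative.

```python
from roboto.domain.datasets.operations import BeginTransactionRequest

# expected_resource_count is optional and supports progress tracking.
request = BeginTransactionRequest(
    origination="roboto client v1.0.0",
    expected_resource_count=25,
)
```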
- class roboto.domain.datasets.operations.CreateDatasetIfNotExistsRequest(/, **data)#
Bases:
pydantic.BaseModel
Request payload to create a dataset if no existing dataset matches the specified query.
Searches for existing datasets using the provided RoboQL query. If a matching dataset is found, returns that dataset. If no match is found, creates a new dataset with the specified properties and returns it.
- Parameters:
data (Any)
- create_request: CreateDatasetRequest#
- match_roboql_query: str#
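A minimal sketch of a get-or-create request. The RoboQL match query and the nested dataset properties are purely illustrative; adjust them to your metadata schema and to actual RoboQL syntax.

```python
from roboto.domain.datasets.operations import (
    CreateDatasetIfNotExistsRequest,
    CreateDatasetRequest,
)

# Return the existing dataset matching the query, or create this one.
request = CreateDatasetIfNotExistsRequest(
    match_roboql_query="metadata.session_id = 'session_42'",  # illustrative RoboQL
    create_request=CreateDatasetRequest(
        name="session_42",
        metadata={"session_id": "session_42"},
        tags=["ingest"],
    ),
)
```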
- class roboto.domain.datasets.operations.CreateDatasetRequest(/, **data)#
Bases:
pydantic.BaseModel
Request payload for creating a new dataset.
Used to specify the initial properties of a dataset during creation, including optional metadata, tags, name, and description.
- Parameters:
data (Any)
- description: str | None = None#
Optional human-readable description of the dataset.
- device_id: str | None = None#
Optional identifier of the device that generated this data.
- metadata: dict[str, Any] = None#
Key-value metadata pairs to associate with the dataset for discovery and search.
- name: str | None = None#
Optional short name for the dataset (max 120 characters).
- tags: list[str] = None#
List of tags for dataset discovery and organization.
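A minimal sketch of creating a dataset; every field is optional, and all values shown are illustrative.

```python
from roboto.domain.datasets.operations import CreateDatasetRequest

request = CreateDatasetRequest(
    name="highway_test_2024_06_01",
    description="Highway driving session, clear weather.",
    metadata={"vehicle": "veh-042", "route": "I-80"},
    tags=["highway", "clear-weather"],
    device_id="veh-042",
)
```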
- class roboto.domain.datasets.operations.CreateDirectoryRequest(/, **data)#
Bases:
pydantic.BaseModel
Request payload to create a directory in a dataset.
- Parameters:
data (Any)
- create_intermediate_dirs: bool = False#
If True, creates intermediate directories in the path if they don’t exist. If False, requires all parent directories to already exist.
- error_if_exists: bool = False#
- name: str#
- origination: str | None = None#
- parent_path: str | None = None#
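A minimal sketch of creating a nested directory; the directory names and paths are illustrative.

```python
from roboto.domain.datasets.operations import CreateDirectoryRequest

# Create "raw" under "telemetry/2024", building intermediate directories
# along the way if they don't already exist.
request = CreateDirectoryRequest(
    name="raw",
    parent_path="telemetry/2024",
    create_intermediate_dirs=True,
    error_if_exists=False,
)
```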
- class roboto.domain.datasets.operations.DeleteDirectoriesRequest(/, **data)#
Bases:
pydantic.BaseModel
Request payload for deleting directories within a dataset.
Used to remove entire directory structures and all contained files from a dataset. This is a bulk operation that affects multiple files.
- Parameters:
data (Any)
- directory_paths: list[str]#
List of directory paths to delete from the dataset.
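A minimal sketch of a bulk directory deletion; the paths are illustrative.

```python
from roboto.domain.datasets.operations import DeleteDirectoriesRequest

# Removes both directory trees and every file they contain.
request = DeleteDirectoriesRequest(
    directory_paths=["telemetry/2023", "scratch"],
)
```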
- class roboto.domain.datasets.operations.QueryDatasetFilesRequest(/, **data)#
Bases:
pydantic.BaseModel
Request payload for querying files within a dataset.
Used to retrieve files from a dataset with optional pattern-based filtering and pagination support. Supports gitignore-style patterns for flexible file selection.
- Parameters:
data (Any)
- exclude_patterns: list[str] | None = None#
List of gitignore-style patterns for files to exclude from results.
- include_patterns: list[str] | None = None#
List of gitignore-style patterns for files to include in results.
- page_token: str | None = None#
Token for retrieving the next page of results in paginated queries.
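A minimal sketch of a file query using gitignore-style patterns; the patterns are illustrative.

```python
from roboto.domain.datasets.operations import QueryDatasetFilesRequest

# Include all MCAP files anywhere in the dataset, but skip anything under tmp/.
request = QueryDatasetFilesRequest(
    include_patterns=["**/*.mcap"],
    exclude_patterns=["tmp/**"],
    page_token=None,  # pass the token from a previous response to continue paging
)
```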
- class roboto.domain.datasets.operations.QueryDatasetsRequest(/, **data)#
Bases:
pydantic.BaseModel
Request payload for querying datasets with filters.
Used to search for datasets based on various criteria such as metadata, tags, and other dataset properties. The filters are applied server-side to efficiently return matching datasets.
- Parameters:
data (Any)
- filters: dict[str, Any] = None#
Dictionary of filter criteria to apply when searching for datasets.
- model_config#
Configuration for the model; should be a dictionary conforming to pydantic.config.ConfigDict.
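A minimal sketch of a dataset query. The filter keys shown are assumptions for illustration only; the accepted filter shape is defined by the platform's query API.

```python
from roboto.domain.datasets.operations import QueryDatasetsRequest

# Filters are applied server-side; only matching datasets are returned.
request = QueryDatasetsRequest(
    filters={"tags": ["highway"], "name": "highway_test_2024_06_01"},  # illustrative keys
)
```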
- class roboto.domain.datasets.operations.RenameDirectoryRequest(/, **data)#
Bases:
pydantic.BaseModel
Request payload for renaming a directory within a dataset.
Used to change the path of a directory and all its contained files within a dataset. This updates the logical organization without moving actual file content.
- Parameters:
data (Any)
- new_path: str#
New path for the directory.
- old_path: str#
Current path of the directory to rename.
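A minimal sketch of renaming a directory; the old and new paths are illustrative.

```python
from roboto.domain.datasets.operations import RenameDirectoryRequest

# Moves every file under "telemetry/raw" to "telemetry/2024/raw" logically,
# without copying file content.
request = RenameDirectoryRequest(
    old_path="telemetry/raw",
    new_path="telemetry/2024/raw",
)
```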
- class roboto.domain.datasets.operations.ReportTransactionProgressRequest(/, **data)#
Bases:
pydantic.BaseModel
Request payload for reporting file upload transaction progress.
Used to notify the platform about the completion status of individual files within a batch upload transaction. This enables progress tracking and partial completion handling for large file uploads.
- Parameters:
data (Any)
- manifest_items: list[str]#
List of manifest item identifiers that have completed upload.
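A minimal sketch of reporting progress for a batch upload. The identifiers below are placeholders; use the manifest item identifiers issued when the transaction was begun.

```python
from roboto.domain.datasets.operations import ReportTransactionProgressRequest

# Mark two manifest items as fully uploaded.
request = ReportTransactionProgressRequest(
    manifest_items=["manifest-item-001", "manifest-item-002"],
)
```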
- class roboto.domain.datasets.operations.TransactionCompletionResponse(/, **data)#
Bases:
pydantic.BaseModel
Response indicating the completion status of a transaction.
Provides information about whether a file upload transaction has been fully completed, including all associated file processing.
- Parameters:
data (Any)
- is_complete: bool#
Whether the transaction has been fully completed.
- class roboto.domain.datasets.operations.UpdateDatasetRequest(/, **data)#
Bases:
pydantic.BaseModel
Request payload for updating dataset properties.
Used to modify dataset metadata, description, name, and other properties. Supports conditional updates based on current field values to prevent conflicting concurrent modifications.
- Parameters:
data (Any)
- conditions: list[roboto.updates.UpdateCondition] | None = None#
Optional list of conditions that must be met for the update to proceed.
- description: str | None = None#
New description for the dataset.
- device_id: str | None = None#
New device ID for the dataset.
- metadata_changeset: roboto.updates.MetadataChangeset | None = None#
Metadata changes to apply (add, update, or remove fields/tags).
- name: str | None = None#
New name for the dataset (max 120 characters).
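A minimal sketch of an update; every field is optional, and this example leaves metadata_changeset and conditions unset. The name and description values are illustrative.

```python
from roboto.domain.datasets.operations import UpdateDatasetRequest

# Change only the dataset's name and description.
request = UpdateDatasetRequest(
    name="highway_test_2024_06_01_v2",
    description="Re-processed with updated calibration.",
)
```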