Orion Client Helpers¶
Collection Helpers¶
- orionclient.helpers.collections.try_hard_to_create_shard(collection, filename, name='', metadata=None, attempts=3)¶
  Tries hard to create a shard and upload a file to it. This is in addition to the built-in request-level retries, and is necessary for reliability at large scales.
Warning
Deprecated. Use Shard.create() and Shard.upload_file() instead.
Parameters: - collection (ShardCollection) – ShardCollection to associate the shard with
- filename (str) – Local file path to upload
- name (str) – Name of the Shard
- metadata (dict) – User-specified metadata
- attempts (int) – Number of create/upload attempts (not request-level retries)
Returns: Instance of Shard
Return type: Shard
Raises: AuthorizationRequired – If the collection session doesn't have valid credentials
- orionclient.helpers.collections.try_hard_to_download_shard(shard, filename, attempts=3)¶
  Tries hard to download a shard to a file. This is in addition to the built-in request-level retries, and is necessary for reliability at large scales.
Warning
Deprecated. Use Shard.download_to_file() instead.
Parameters: - shard (Shard) – Shard to download
- filename (str) – Local file path to download to
- attempts (int) – Number of download attempts (not request-level retries)
Raises: AuthorizationRequired – If shard session doesn’t have valid credentials
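Both helpers follow the same "try hard" pattern: attempt the whole operation up to attempts times, on top of any request-level retries, and surface an error only if every attempt fails. The following is a minimal generic sketch of that retry loop, not the orionclient implementation; the function name try_hard and the linear backoff are illustrative assumptions:

```python
import time


def try_hard(operation, attempts=3, delay=1.0):
    """Call `operation` up to `attempts` times, re-raising only the last error.

    NOTE: hypothetical sketch of the retry pattern described above; the real
    orionclient helpers also handle shard state between attempts.
    """
    last_error = None
    for attempt in range(attempts):
        try:
            return operation()
        except Exception as exc:
            last_error = exc
            time.sleep(delay * (attempt + 1))  # simple linear backoff (assumed)
    raise last_error
```

Operation-level retries like this matter at scale because an upload or download can fail partway through in ways a single retried HTTP request cannot recover from.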
Parameterization Helpers¶
- orionclient.helpers.parameterize.parameterize_workfloe(workfloe, name, defaults, parameters, parallel=False, wait=True, session=None)¶
  Utility for running a WorkFloeSpec with different sets of parameters. Often used for benchmarking WorkFloes, but it can also be used to run numerous jobs with different inputs.
Parameters: - workfloe (WorkFloeSpec) – Workfloe Spec to run jobs against
- name (string) – Name of the floe to run; the step is appended in the format <name>-<step>, where step is the index of the parameter list entry used to run the job
- defaults (dict) – Default arguments to pass to all of the jobs
- parameters (list) – List of dictionaries that contain parameters to update defaults with for each job.
- parallel (bool) – Run jobs concurrently
- wait (bool) – Wait until all jobs are complete
- session (OrionSession) – Session to use, otherwise defaults to APISession
Returns: List of WorkFloeJob objects
Raises: ValidationError – If the input is invalid
The following example shows how to launch parameterized workfloes.
from orionclient.types import WorkFloeSpec
from orionclient.session import APISession
from orionclient.helpers.parameterize import parameterize_workfloe

# Get a WorkFloe specification by ID
workfloe = APISession.get_resource(WorkFloeSpec, 999)
# Define the values every floe will have
defaults = {"promoted": {"in": 618}}
parameters = []
# For each iteration run the dummy cube with the parameter step as the iteration number.
for x in range(5):
    parameters.append({"cube": {"dummy": {"step": x}}})
# Returns a list of WorkFloeJob objects
jobs = parameterize_workfloe(
    workfloe,
    "Example Parameterized Floe",
    defaults,
    parameters,
)
See also
Benchmarking Workfloes section
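Conceptually, for each entry in parameters the helper updates defaults with that entry and launches a job named <name>-<step>, where step is the entry's index. The sketch below illustrates that merge-and-name step in plain Python; it is not the orionclient implementation, and whether the real helper merges nested dictionaries shallowly or deeply is not specified here (this sketch uses a shallow update):

```python
def build_job_requests(name, defaults, parameters):
    """Hypothetical sketch: derive per-job names and parameters the way
    parameterize_workfloe is documented to (name suffixed with the list
    index, defaults updated with each parameter dict)."""
    requests = []
    for step, overrides in enumerate(parameters):
        params = dict(defaults)   # shallow copy of the shared defaults
        params.update(overrides)  # per-job overrides win
        requests.append({"name": f"{name}-{step}", "parameters": params})
    return requests
```

For example, with defaults {"promoted": {"in": 618}} and two parameter dicts, this yields jobs named <name>-0 and <name>-1, each carrying the defaults plus its own overrides.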
Workfloe Helpers¶
- orionclient.helpers.floe.get_current_workfloe_job(session=None)¶
  Utility function for use within cubes running in Orion to retrieve a WorkFloeJob object representing the running floe.
Parameters: session (OrionSession) – Session to use, otherwise defaults to APISession
Returns: WorkFloeJob
Raises: OrionError – Cube is not running in Orion, or a server error occurred