Base class for all backend GraphQL API objects.
This class provides a framework for API objects that are constructed from a partially complete JSON response and fill in their remaining members dynamically as needed.
Base constructor.
Each object requires:
- a GraphQL client, used to fill in missing members dynamically
- an initial data blob, typically from deserialized JSON, containing at least the data for required members (typically the object's primary key) and optionally other data members
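A minimal sketch of the lazy-fill pattern described above; the class, method, and attribute names here are illustrative assumptions, not the SDK's actual internals.

```python
# Illustrative sketch only -- names are assumptions, not the SDK's real internals.
class ApiObject:
    def __init__(self, gql_client, data):
        self._gql_client = gql_client   # kept so missing members can be fetched later
        self._data = dict(data)         # partial blob; must contain the primary key

    def _get(self, key):
        # Return a cached member, fetching the full record lazily on the first miss.
        if key not in self._data:
            self._data.update(self._gql_client.fetch(self._data["id"]))
        return self._data[key]


class FakeClient:
    """Stand-in GraphQL client for the sketch."""
    def fetch(self, object_id):
        return {"id": object_id, "name": "example-deployment"}


obj = ApiObject(FakeClient(), {"id": 1})
print(obj._get("name"))  # triggers the lazy fetch -> "example-deployment"
```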
Deploys this deployment, if it is not already deployed.
If the deployment is already deployed, this is a no-op.
Shuts down this deployment, if it is deployed.
If the deployment is already undeployed, this is a no-op.
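A hypothetical usage sketch, assuming the two methods described above are exposed as deploy() and undeploy() on a Deployment instance obtained elsewhere from the SDK:

```python
def cycle_deployment(deployment):
    # Both calls are safe to repeat: deploy() is a no-op if already deployed,
    # and undeploy() is a no-op if already undeployed.
    deployment.undeploy()
    deployment.deploy()
```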
Returns a dict of deployment status, useful for determining whether a deployment has succeeded.
Returns
Dict of deployment internal state information.
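For example, a caller might poll this dict to decide whether the deployment came up. The method name status() and the "status" key below are assumptions based on the description above:

```python
def is_running(deployment):
    # Assumes status() returns a dict whose "status" field reflects the engine state.
    return deployment.status().get("status") == "Running"
```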
Waits for the deployment status to enter the "Running" state.
Will wait up "timeout_request" seconds for the deployment to enter that state. This is set in the "Client" object constructor. Will raise various exceptions on failures.
Returns
The deployment, for chaining.
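Because the deployment itself is returned, the call can be chained. A sketch, assuming the method is named wait_for_running():

```python
def deploy_and_wait(deployment):
    deployment.deploy()
    # Blocks for up to timeout_request seconds (set on the Client) and returns
    # the deployment for further chaining.
    return deployment.wait_for_running()
```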
Waits for the deployment to end.
Will wait up "timeout_request" seconds for the deployment to enter that state. This is set in the "Client" object constructor. Will raise various exceptions on failures.
Returns
The deployment, for chaining.
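The mirror-image sketch for tearing a deployment down, assuming the method is named wait_for_undeployed():

```python
def undeploy_and_wait(deployment):
    deployment.undeploy()
    # Blocks for up to timeout_request seconds (set on the Client) and returns
    # the deployment for chaining.
    return deployment.wait_for_undeployed()
```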
Returns an inference result on this deployment, given a tensor.
Parameters
- tensor: Union[Dict[str, Any], pd.DataFrame, pa.Table] Inference data. Currently should be a dictionary; pandas DataFrame and Arrow Table inputs are planned as a future improvement.
- timeout: Optional[Union[int, float]] Infer requests will time out after the given number of seconds. Defaults to 15 seconds.
- dataset: Optional[Union[Sequence[str], str]] Specifies what data to return. Defaults to ["time", "out"]; other available options: "check_failures", "metadata".
- dataset_exclude: Optional[Union[Sequence[str], str]] If set, allows the user to exclude parts of the dataset.
- dataset_separator: Optional[str] If set to ".", the returned dataset will be flattened.
Returns
InferenceResult in dictionary, DataFrame, or Arrow format.
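A sketch of a single inference call, assuming the method is named infer(); the tensor values are made-up sample data and the keyword names follow the parameter list above:

```python
def run_single_inference(deployment):
    tensor = {"tensor": [[1.0, 2.0, 3.0, 4.0]]}  # made-up sample input
    return deployment.infer(
        tensor,
        timeout=15,               # seconds before the request times out
        dataset=["time", "out"],  # default data to return
        dataset_separator=".",    # flatten the returned dataset
    )
```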
Async method to run batched inference on a data file for a given deployment.
Parameters
- str filename: Path to an existing file with tensor data in JSON format.
- str data_key: Key under which the tensor data is stored within the JSON. Defaults to "tensor".
- int batch_size: Batch size to use when sending requests to the engine. Defaults to 1000.
- int connector_limit: Limit on the number of TCP connections. Defaults to 4.
Returns
List of InferenceResult objects.
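A sketch of driving the batch call from asyncio, assuming the coroutine is named batch_infer_from_file() and the keyword names mirror the parameters above; "tensors.json" is a placeholder path:

```python
import asyncio

async def run_batch(deployment):
    results = await deployment.batch_infer_from_file(
        "tensors.json",      # placeholder path to a JSON file of tensor data
        data_key="tensor",   # key the tensor data is stored under
        batch_size=1000,     # requests per batch sent to the engine
        connector_limit=4,   # cap on concurrent TCP connections
    )
    return results           # list of InferenceResult objects

# asyncio.run(run_batch(deployment))  # with a real Deployment instance
```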
Replaces the current model with a default-configured Model.
Parameters
- Model model: Model variant to replace the current model with.
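A one-line sketch, assuming the method is named replace_model() and model is a Model variant already uploaded to the platform:

```python
def swap_model(deployment, model):
    # The replacement model receives a default configuration.
    deployment.replace_model(model)
```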
Replaces the current model with a configured variant.
Parameters
- ModelConfig model_config: Configured model to replace the current model with.
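The configured counterpart, assuming the method is named replace_configured_model() and model_config is a ModelConfig built ahead of time:

```python
def swap_configured_model(deployment, model_config):
    # Replaces the current model with an already-configured ModelConfig.
    deployment.replace_configured_model(model_config)
```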
Returns the internal inference URL, which is only reachable from inside the Wallaroo cluster by SDK instances deployed in the cluster.
If both pipelines and models are configured on the Deployment, this gives preference to pipelines. The returned URL is always for the first configured pipeline or model.
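A sketch of using that URL from an in-cluster SDK instance; the method name internal_url() and the assumption that the endpoint accepts a JSON POST are both inferred from the description above, not confirmed from the SDK:

```python
import requests

def post_to_internal_url(deployment, payload):
    # Only resolvable from inside the Wallaroo cluster.
    url = deployment.internal_url()
    return requests.post(url, json=payload, timeout=15)
```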