Environments

How to run the Wallaroo Inference Server in any cloud, any architecture, any platform.

Models uploaded to Wallaroo Ops and added as pipeline steps can be deployed in any edge and multicloud environment by:

  • Publishing the Pipeline: Publish the pipeline and model artifacts to an Open Container Initiative (OCI) registry.
  • Deploying the Pipeline: Deploy the published pipeline on the target environment and hardware.
  • Inferencing on the Pipeline: Submit HTTP inference requests to the deployed pipeline and models.
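As a sketch of the inference step, the snippet below builds a pandas-records JSON payload and shows how it would be POSTed to a deployed pipeline. The pipeline name, host, port, and input schema here are illustrative assumptions; substitute the values from your own edge deployment.

```python
import json

# Hypothetical edge endpoint -- replace with your deployment's URL.
EDGE_URL = "http://localhost:8080/pipelines/my-pipeline"

# Inference inputs as pandas-records JSON: a list of row objects.
# The field name "tensor" and its shape are assumptions for this sketch.
payload = json.dumps([{"tensor": [1.0, 2.0, 3.0, 4.0]}])

headers = {"Content-Type": "application/json; format=pandas-records"}

# Sending the request requires a running edge deployment, e.g.:
# import urllib.request
# req = urllib.request.Request(
#     EDGE_URL, data=payload.encode(), headers=headers, method="POST"
# )
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read()))

print(payload)
```

The same payload works against any environment the pipeline is published to, since the Wallaroo Inference Server exposes the same HTTP interface regardless of cloud, architecture, or platform.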

Edge Publish

How to publish models for deployment on edge and multicloud environments.

Edge Deployment

How to deploy published pipelines and models in edge and multicloud environments.

Edge Inference

How to perform inferences on deployed models in edge and multicloud environments.