This tutorial and the assets can be downloaded as part of the Wallaroo Tutorials repository.
The following tutorial demonstrates deploying and serving an XGBoost Booster Multiclass Classification Softmax model to Wallaroo.
The following XGBoost model types are supported by Wallaroo. XGBoost model types not listed below can still be deployed through Custom Models, also known as Bring Your Own Predict (BYOP).
XGBoost Model Type | Wallaroo Auto Packaging Supported |
---|---|
XGBClassifier | √ |
XGBRegressor | √ |
Booster Classifier | √ |
Booster Regressor | √ |
Booster Random Forest Regressor | √ |
Booster Random Forest Classifier | √ |
XGBRFClassifier | √ |
XGBRFRegressor | √ |
XGBRanker* | X |
Upload, deploy, and serve a sample XGBoost Booster Multiclass Classification Softmax model.
This tutorial provides the following:
./models/booster_multi_classification_softmax.pkl
: The sample XGBoost model, which accepts data from the sklearn.datasets.load_iris
dataset.

The first step is to import the libraries we will need. See ./requirements.txt
for a list of additional libraries used with this tutorial.
import wallaroo
from wallaroo.deployment_config import DeploymentConfigBuilder
from wallaroo.pipeline import Pipeline
import pyarrow as pa
from wallaroo.framework import Framework
import pickle
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from xgboost import train, DMatrix
The next step is to connect to Wallaroo through the Wallaroo client. The Python library is included in the Wallaroo install and is available through the JupyterHub interface provided with your Wallaroo environment.
This is accomplished using the wallaroo.Client()
command, which provides a URL to grant the SDK permission to your specific Wallaroo environment. When displayed, enter the URL into a browser and confirm permissions. Store the connection into a variable that can be referenced later.
If logging into the Wallaroo instance through the internal JupyterHub service, use wl = wallaroo.Client()
. For more details on logging in through Wallaroo, see the Wallaroo SDK Essentials Guide: Client Connection.
wl = wallaroo.Client()
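If you are logging in from outside the Wallaroo cluster, the client can instead be pointed at the instance's API endpoint. The call below is a hedged sketch: the api_endpoint and auth_type parameter names are assumptions that may differ between SDK versions, so confirm them against the Wallaroo SDK Essentials Guide: Client Connection before use.

# Hedged sketch for an external connection -- the parameter names are assumptions;
# verify against your SDK version and replace the placeholder URL with your instance.
wl = wallaroo.Client(api_endpoint="https://<your-wallaroo-instance>", auth_type="sso")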
We’ll set the names of our workspace, pipeline, model, and model file. Workspace names must be unique across the Wallaroo instance. In this tutorial the names are hard coded so that it runs in the same workspace each time; if you share a Wallaroo instance with other users, consider appending a short random suffix to the workspace name to prevent collisions with other users’ workspaces.
workspace_name = 'xgboost-booster-multiclass-classification-softmax'
pipeline_name = 'xgboost-booster-multiclass-classification-softmax'
model_name = 'booster-multiclass-classification-softmax'
model_file_name = './models/booster_multi_classification_softmax.pkl'
We will now create the Wallaroo workspace to store our model and set it as the current workspace. Future commands will default to this workspace for pipeline creation, model uploads, etc. We’ll create our Wallaroo pipeline to deploy our model.
workspace = wl.get_workspace(name=workspace_name, create_if_not_exist=True)
wl.set_current_workspace(workspace)
pipeline = wl.build_pipeline(pipeline_name)
XGBoost models are uploaded to Wallaroo through the wallaroo.client.Client.upload_model
method.
The following parameters are available for XGBoost models.
Parameter | Type | Description |
---|---|---|
name | string (Required) | The name of the model. Model names are unique per workspace. Models that are uploaded with the same name are assigned as a new version of the model. |
path | string (Required) | The path to the model file being uploaded. |
framework | string (Required) | Set as Framework.XGBOOST . |
input_schema | pyarrow.lib.Schema (Required) | The input schema in Apache Arrow schema format. |
output_schema | pyarrow.lib.Schema (Required) | The output schema in Apache Arrow schema format. |
convert_wait | bool (Optional) (Default: True) | If True, the upload method waits until the model upload and conversion are complete before returning; if False, it returns immediately while conversion continues in the background. |
Once the upload process starts, the model is containerized by the Wallaroo instance. This process may take up to 10 minutes.
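If you prefer not to block the notebook while conversion runs, the optional convert_wait parameter can be set to False. The call below is a sketch only, using the same arguments as the upload step later in this tutorial; the upload starts and conversion continues in the background.

# Sketch only: start the upload without waiting for conversion to finish.
model = wl.upload_model(model_name,
                        model_file_name,
                        framework=Framework.XGBOOST,
                        input_schema=input_schema,
                        output_schema=output_schema,
                        convert_wait=False)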
The following is returned with a successful model upload and conversion.
Field | Type | Description |
---|---|---|
name | string | The name of the model. |
version | string | The model version as a unique UUID. |
file_name | string | The file name of the model as stored in Wallaroo. |
image_path | string | The image used to deploy the model in the Wallaroo engine. |
last_update_time | DateTime | When the model was last updated. |
First we configure the input and output schemas in PyArrow format. Each input row is a list of four float32 values, one per Iris feature. Because the model is trained with a softmax objective it returns a single predicted class label per row, so the output schema is a single float32 field.
input_schema = pa.schema([
pa.field('inputs', pa.list_(pa.float32(), list_size=4))
])
output_schema = pa.schema([
pa.field('predictions', pa.float32()),
])
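For reference, the output schema mirrors whatever shape the model returns. If the model had instead been trained with a probability-per-class objective (for example multi:softprob), the output would be a list of three float32 values, one per Iris class. The schema below is a sketch only, with a hypothetical field name, and is not used in this tutorial.

# Sketch only -- not used in this tutorial. Output schema for a model that
# returns one probability per class; the field name here is hypothetical.
probability_output_schema = pa.schema([
    pa.field('probabilities', pa.list_(pa.float32(), list_size=3))
])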
With the input and output schemas defined, we now upload the XGBoost model.
model = wl.upload_model(model_name,
model_file_name,
framework=Framework.XGBOOST,
input_schema=input_schema,
output_schema=output_schema)
model
Waiting for model loading - this will take up to 10.0min.
Model is pending loading to a native runtime.
Model is attempting loading to a native runtime.successful
Ready
Name | booster-multiclass-classification-softmax |
Version | d7d80765-0bf9-42ee-b9ad-47102ecc44db |
File Name | booster_multi_classification_softmax.pkl |
SHA | 6ace49bb2a3514724937de8ec023a14c94ac71dde7dce220ad27e44d296b4b25 |
Status | ready |
Image Path | None |
Architecture | x86 |
Acceleration | none |
Updated At | 2024-19-Jul 16:35:38 |
Workspace id | 42 |
Workspace name | xgboost-booster-multiclass-classification-softmax |
With the model uploaded and packaged, we add the model as a pipeline model step. For our deployment, we will set a minimum deployment configuration - this is the amount of resources the deployed pipeline uses from the cluster.
Once set, we deploy the pipeline, which allocates the assigned resources for the cluster and makes it available for inference requests.
pipeline.add_model_step(model)
deployment_config = DeploymentConfigBuilder() \
.cpus(0.25).memory('1Gi') \
.build()
pipeline.deploy(deployment_config=deployment_config)
name | xgboost-booster-multiclass-classification-softmax |
---|---|
created | 2024-07-19 16:35:28.828304+00:00 |
last_updated | 2024-07-19 16:35:41.666829+00:00 |
deployed | True |
workspace_id | 42 |
workspace_name | xgboost-booster-multiclass-classification-softmax |
arch | x86 |
accel | none |
tags | |
versions | 5303f2dd-cffb-44d2-bbea-ae94465dd9ce, aa56d2f6-72d9-4194-9c7f-58c444f43d03 |
steps | booster-multiclass-classification-softmax |
published | False |
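The minimal configuration above is sufficient for this tutorial. For heavier inference loads you could allocate more resources before deploying; the configuration below is a sketch that assumes the replica_count builder method is available in your SDK version.

# Sketch only: a larger deployment configuration. replica_count is assumed to
# be available in your SDK version -- verify before use.
scaled_config = DeploymentConfigBuilder() \
    .replica_count(2) \
    .cpus(1).memory('2Gi') \
    .build()
# pipeline.deploy(deployment_config=scaled_config)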
The inference data comes from the sklearn.datasets.load_iris
examples. The samples are converted to a pandas DataFrame, which is then submitted to the deployed pipeline in Wallaroo as an inference request.
# Load the Iris dataset and split it into train and test sets.
dataset = load_iris()
X, y = dataset.data, dataset.target
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Convert the splits to XGBoost DMatrix format, then pull the test features
# back out as a dense array for the inference request.
dtrain = DMatrix(X_train, label=y_train)
dtest = DMatrix(X_test, label=y_test)
data = dtest.get_data().todense()[:100]

# Build a DataFrame matching the input schema: an 'inputs' column where each
# row is a list of four float values.
import pandas as pd
dataframe = pd.DataFrame({"inputs": data.tolist()})

results = pipeline.infer(dataframe)
results
results
time | in.inputs | out.predictions | anomaly.count | |
---|---|---|---|---|
0 | 2024-07-19 16:35:56.479 | [6.0999999046, 2.7999999523, 4.6999998093, 1.2... | 1.0 | 0 |
1 | 2024-07-19 16:35:56.479 | [5.6999998093, 3.7999999523, 1.7000000477, 0.3... | 0.0 | 0 |
2 | 2024-07-19 16:35:56.479 | [7.6999998093, 2.5999999046, 6.9000000954, 2.2... | 2.0 | 0 |
3 | 2024-07-19 16:35:56.479 | [6.0, 2.9000000954, 4.5, 1.5] | 1.0 | 0 |
4 | 2024-07-19 16:35:56.479 | [6.8000001907, 2.7999999523, 4.8000001907, 1.3... | 1.0 | 0 |
5 | 2024-07-19 16:35:56.479 | [5.4000000954, 3.4000000954, 1.5, 0.400000006] | 0.0 | 0 |
6 | 2024-07-19 16:35:56.479 | [5.5999999046, 2.9000000954, 3.5999999046, 1.2... | 1.0 | 0 |
7 | 2024-07-19 16:35:56.479 | [6.9000000954, 3.0999999046, 5.0999999046, 2.2... | 2.0 | 0 |
8 | 2024-07-19 16:35:56.479 | [6.1999998093, 2.2000000477, 4.5, 1.5] | 1.0 | 0 |
9 | 2024-07-19 16:35:56.479 | [5.8000001907, 2.7000000477, 3.9000000954, 1.2... | 1.0 | 0 |
10 | 2024-07-19 16:35:56.479 | [6.5, 3.2000000477, 5.0999999046, 2.0] | 2.0 | 0 |
11 | 2024-07-19 16:35:56.479 | [4.8000001907, 3.0, 1.3999999762, 0.1000000015] | 0.0 | 0 |
12 | 2024-07-19 16:35:56.479 | [5.5, 3.5, 1.2999999523, 0.200000003] | 0.0 | 0 |
13 | 2024-07-19 16:35:56.479 | [4.9000000954, 3.0999999046, 1.5, 0.1000000015] | 0.0 | 0 |
14 | 2024-07-19 16:35:56.479 | [5.0999999046, 3.7999999523, 1.5, 0.3000000119] | 0.0 | 0 |
15 | 2024-07-19 16:35:56.479 | [6.3000001907, 3.2999999523, 4.6999998093, 1.6... | 1.0 | 0 |
16 | 2024-07-19 16:35:56.479 | [6.5, 3.0, 5.8000001907, 2.2000000477] | 2.0 | 0 |
17 | 2024-07-19 16:35:56.479 | [5.5999999046, 2.5, 3.9000000954, 1.1000000238] | 1.0 | 0 |
18 | 2024-07-19 16:35:56.479 | [5.6999998093, 2.7999999523, 4.5, 1.2999999523] | 1.0 | 0 |
19 | 2024-07-19 16:35:56.479 | [6.4000000954, 2.7999999523, 5.5999999046, 2.2... | 2.0 | 0 |
20 | 2024-07-19 16:35:56.479 | [4.6999998093, 3.2000000477, 1.6000000238, 0.2... | 0.0 | 0 |
21 | 2024-07-19 16:35:56.479 | [6.0999999046, 3.0, 4.9000000954, 1.7999999523] | 2.0 | 0 |
22 | 2024-07-19 16:35:56.479 | [5.0, 3.4000000954, 1.6000000238, 0.400000006] | 0.0 | 0 |
23 | 2024-07-19 16:35:56.479 | [6.4000000954, 2.7999999523, 5.5999999046, 2.0... | 2.0 | 0 |
24 | 2024-07-19 16:35:56.479 | [7.9000000954, 3.7999999523, 6.4000000954, 2.0] | 2.0 | 0 |
25 | 2024-07-19 16:35:56.479 | [6.6999998093, 3.0, 5.1999998093, 2.2999999523] | 2.0 | 0 |
26 | 2024-07-19 16:35:56.479 | [6.6999998093, 2.5, 5.8000001907, 1.7999999523] | 2.0 | 0 |
27 | 2024-07-19 16:35:56.479 | [6.8000001907, 3.2000000477, 5.9000000954, 2.2... | 2.0 | 0 |
28 | 2024-07-19 16:35:56.479 | [4.8000001907, 3.0, 1.3999999762, 0.3000000119] | 0.0 | 0 |
29 | 2024-07-19 16:35:56.479 | [4.8000001907, 3.0999999046, 1.6000000238, 0.2... | 0.0 | 0 |
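As an optional sanity check, the predicted class labels can be compared against the held-out labels from the train/test split. The comparison below assumes the inference results are returned in the same row order as the submitted DataFrame.

# Optional sanity check: compare predictions to the held-out labels.
# Assumes result rows come back in the same order they were submitted.
predicted = results['out.predictions'].to_numpy().astype(int)
actual = y_test[:len(predicted)]
print("accuracy:", (predicted == actual).mean())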
With the tutorial complete, we undeploy the pipeline and return the resources to the cluster.
pipeline.undeploy()
name | xgboost-booster-multiclass-classification-softmax |
---|---|
created | 2024-07-19 16:35:28.828304+00:00 |
last_updated | 2024-07-19 16:35:41.666829+00:00 |
deployed | False |
workspace_id | 42 |
workspace_name | xgboost-booster-multiclass-classification-softmax |
arch | x86 |
accel | none |
tags | |
versions | 5303f2dd-cffb-44d2-bbea-ae94465dd9ce, aa56d2f6-72d9-4194-9c7f-58c444f43d03 |
steps | booster-multiclass-classification-softmax |
published | False |