Requirements for using custom components with ML models

You can define your own transformers, estimators, functions, classes, and tensor operations in models that you deploy in IBM watsonx.ai Runtime as online deployments.

Defining and using custom components

To use custom components with your models, you need to package your custom components in a Python distribution package.

Package requirements

  • The package must be a source distribution (Wheel and Egg distributions are not supported).
  • The package file format must be .zip.
  • Any third-party dependencies for your custom components must be installable by pip and must be passed to the install_requires argument of the setup function of the setuptools library.

Refer to: Creating a source distribution

Storing your custom package

You must take extra steps when you store your trained model in the watsonx.ai Runtime repository:

  • Store your custom package in the watsonx.ai Runtime repository (use the runtimes.store_library function from the watsonx.ai Python client, or the store libraries command from the watsonx.ai Runtime CLI).
  • Create a runtime resource object that references your stored custom package, and then store the runtime resource object in the watsonx.ai Runtime repository (use the runtimes.store function, or the store runtimes command).
  • When you store your trained model in the watsonx.ai Runtime repository, reference your stored runtime resource in the metadata that is passed to the store_model function (or the store command).
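The three steps above can be sketched as follows. The metadata keys and the helper function are assumptions for illustration only; consult the watsonx.ai Python client documentation for the exact meta_props names that runtimes.store_library, runtimes.store, and store_model expect.

```python
# Illustrative sketch of the three storage steps. The dictionary keys and
# names below are hypothetical placeholders, not documented meta_props.

def build_metadata(package_path, library_uid=None, runtime_uid=None):
    """Assemble hypothetical metadata for each of the three steps."""
    # Step 1: metadata for storing the custom package (a .zip source dist).
    library_meta = {
        "name": "my_custom_transformers",  # hypothetical package name
        "filepath": package_path,          # path to the .zip source distribution
    }
    # Step 2: a runtime resource that references the stored library.
    runtime_meta = {
        "name": "my_custom_runtime",
        "libraries": [library_uid],        # UID returned by runtimes.store_library
    }
    # Step 3: model metadata that references the stored runtime resource.
    model_meta = {
        "name": "my_model",
        "runtime_uid": runtime_uid,        # UID returned by runtimes.store
    }
    return library_meta, runtime_meta, model_meta

# Against a connected client, the calls would look roughly like:
#   library = client.runtimes.store_library(library_meta)
#   runtime = client.runtimes.store(runtime_meta)
#   model   = client.repository.store_model(trained_model, meta_props=model_meta)
```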

Supported frameworks

These frameworks support custom components:

  • Scikit-learn
  • XGBoost
  • TensorFlow
  • Python Functions
  • Python Scripts
  • Decision Optimization

For more information, see Supported frameworks.


Parent topic: Customizing deployment runtimes
