Last updated: Feb 03, 2025
Find out what foundation models are available for use in IBM watsonx.ai.
The watsonx.ai Python library has a helper class for referencing the model IDs and names. For more information, see TextModels.
The following code uses the TextModels helper class to list the model IDs of the available models.
Python code
from ibm_watsonx_ai import APIClient
from ibm_watsonx_ai import Credentials
credentials = Credentials(
    url = "https://{region}.ml.cloud.ibm.com",
    api_key = "{my-IBM-Cloud-API-key}",
)
api_client = APIClient(credentials)
api_client.foundation_models.TextModels.show()
Sample output
{'GRANITE_13B_CHAT_V2': 'ibm/granite-13b-chat-v2',
'GRANITE_13B_INSTRUCT_V2': 'ibm/granite-13b-instruct-v2',
...
}
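Each name in this output is an enum member whose value is the corresponding model ID string. The following minimal sketch, which assumes the GRANITE_13B_CHAT_V2 member from the sample output above, shows how to resolve a member to the ID string that the service expects.
Python code
# The enum member's value is the model ID string shown in the listing above.
granite = api_client.foundation_models.TextModels.GRANITE_13B_CHAT_V2
print(granite.value)  # expected to print: ibm/granite-13b-chat-v2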
Example: View details of a foundation model
You can view details, such as a short description and foundation model limits, by using get_details().
Python code
from ibm_watsonx_ai.foundation_models import ModelInference
import json
model_id = api_client.foundation_models.TextModels.FLAN_T5_XXL
project_id = "{my-project-ID}"
model = ModelInference(model_id=model_id, project_id=project_id, api_client=api_client)
model_details = model.get_details()
print(json.dumps(model_details, indent=2))
Note: Replace {region}, {my-IBM-Cloud-API-key}, and {my-project-ID} with valid values for your environment.
Sample output
{
  "model_id": "google/flan-t5-xxl",
  "label": "flan-t5-xxl-11b",
  "provider": "Google",
  "source": "Hugging Face",
  "short_description": "flan-t5-xxl is an 11 billion parameter model based on the Flan-T5 family.",
  ...
}
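Because get_details() returns a Python dictionary, you can read individual fields from it directly. The following is a minimal sketch: short_description appears in the sample output above, while the model_limits key is an assumption that you should confirm against the full output for your model.
Python code
# Read selected fields from the details dictionary returned by get_details().
print(model_details["short_description"])
# Limits-related key names can vary by model; inspect the full output to
# confirm which keys are present before relying on them.
print(model_details.get("model_limits", "no limits field returned"))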
The following code sample uses a foundation model's ID to view model details.
Python code
import json
model_id = api_client.foundation_models.TextModels.FLAN_T5_XXL
model_details = api_client.foundation_models.get_model_specs(model_id)
print(json.dumps(model_details, indent=2))
You can specify the model_id in inference requests as follows:
model = ModelInference(
    model_id="google/flan-ul2",
    ...
)
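For example, a minimal inference request might look like the following sketch, which uses the generate_text method of ModelInference. The prompt text is illustrative, and the project ID placeholder must be replaced as described in the note above.
Python code
from ibm_watsonx_ai.foundation_models import ModelInference

# Build an inference object for the chosen model and send a simple prompt.
# The prompt and the project ID placeholder are illustrative.
model = ModelInference(
    model_id="google/flan-ul2",
    project_id="{my-project-ID}",
    api_client=api_client,
)
generated_text = model.generate_text(prompt="Classify the sentiment: I love this product.")
print(generated_text)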
Parent topic: Python library