GitHub Repository: ibm/watson-machine-learning-samples
Path: blob/master/cpd5.2/notebooks/python_sdk/deployments/foundation_models/Use watsonx to manage Prompt Template assets and create deployment.ipynb
Kernel: watsonx-ai-samples-py-312


Use watsonx.ai python SDK to manage Prompt Template assets and create deployment

Disclaimers

  • Use only Projects and Spaces that are available in watsonx context.

Notebook content

This notebook contains the steps and code to demonstrate support for Prompt Templates, their inference, and their deployments.

Some familiarity with Python is helpful. This notebook uses Python 3.12.

Learning goal

The goal of this notebook is to demonstrate how to create a Prompt Template asset and a deployment that points to it. In general, a Prompt Template is a pattern for generating prompts for language models. A template may contain an instruction, input/output prefixes, few-shot examples, and appropriate context that may vary depending on the task.
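The placeholder idea behind prompt templates can be illustrated with plain Python string formatting (this is only an analogy; the SDK performs the substitution itself when a template is deployed):

```python
# A template holds a pattern with named placeholders; substituting the
# input variables yields a concrete prompt for a language model.
template = "What is {object} and how does it work?"

prompt = template.format(object="a loan")
print(prompt)  # -> What is a loan and how does it work?
```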

Contents

This notebook contains the following parts:

  • Set up the environment

  • Prompt Template on watsonx.ai

  • Deployment pointing to Prompt Template

  • Generate text using ModelInference

  • Summary and next steps

Set up the environment

Before you use the sample code in this notebook, you must perform the following setup tasks:

  • Contact your Cloud Pak for Data administrator and ask them for your account credentials

Install dependencies

Note: refer to the ibm-watsonx-ai SDK documentation for details.

%pip install -U "langchain>=0.3.12,<0.4" | tail -n 1
%pip install -U ibm-watsonx-ai | tail -n 1
Successfully installed PyYAML-6.0.2 SQLAlchemy-2.0.41 annotated-types-0.7.0 anyio-4.9.0 certifi-2025.4.26 charset-normalizer-3.4.2 h11-0.16.0 httpcore-1.0.9 httpx-0.28.1 idna-3.10 jsonpatch-1.33 jsonpointer-3.0.0 langchain-0.3.25 langchain-core-0.3.62 langchain-text-splitters-0.3.8 langsmith-0.3.43 orjson-3.10.18 packaging-24.2 pydantic-2.11.5 pydantic-core-2.33.2 requests-2.32.3 requests-toolbelt-1.0.0 sniffio-1.3.1 tenacity-9.1.2 typing-extensions-4.13.2 typing-inspection-0.4.1 urllib3-2.4.0 zstandard-0.23.0
Successfully installed ibm-cos-sdk-2.14.1 ibm-cos-sdk-core-2.14.1 ibm-cos-sdk-s3transfer-2.14.1 ibm-watsonx-ai-1.3.23 jmespath-1.0.1 lomond-0.3.3 numpy-2.2.6 pandas-2.2.3 pytz-2025.2 requests-2.32.2 tabulate-0.9.0 tzdata-2025.2

Define credentials

Authenticate the watsonx.ai Runtime service on IBM Cloud Pak for Data. You need to provide the admin's username and the platform URL.

username = "PASTE YOUR USERNAME HERE"
url = "PASTE THE PLATFORM URL HERE"

Use the admin's api_key to authenticate watsonx.ai Runtime services:

import getpass

from ibm_watsonx_ai import Credentials

credentials = Credentials(
    username=username,
    api_key=getpass.getpass("Enter your watsonx.ai API key and hit enter: "),
    url=url,
    instance_id="openshift",
    version="5.2",
)

Alternatively, you can use the admin's password:

import getpass

from ibm_watsonx_ai import Credentials

if "credentials" not in locals() or not credentials.api_key:
    credentials = Credentials(
        username=username,
        password=getpass.getpass("Enter your watsonx.ai password and hit enter: "),
        url=url,
        instance_id="openshift",
        version="5.2",
    )

Working with projects

First of all, you need to create a project that will be used for your work. If you do not have a project created already, follow the steps below:

  • Open IBM Cloud Pak main page

  • Click all projects

  • Create an empty project

  • Copy project_id from url and paste it below

Action: Assign project ID below

import os

try:
    project_id = os.environ["PROJECT_ID"]
except KeyError:
    project_id = input("Please enter your project_id (hit enter): ")

Create APIClient instance

from ibm_watsonx_ai import APIClient

client = APIClient(credentials, project_id)

Prompt Template on watsonx.ai

from ibm_watsonx_ai.foundation_models.prompts import (
    PromptTemplateManager,
    PromptTemplate,
)
from ibm_watsonx_ai.foundation_models.utils.enums import (
    DecodingMethods,
    PromptTemplateFormats,
)
from ibm_watsonx_ai.metanames import GenTextParamsMetaNames as GenParams

Create PromptTemplateManager instance

prompt_mgr = PromptTemplateManager(credentials=credentials, project_id=project_id)

List text models available on CPD cluster

for model in client.foundation_models.TextModels:
    print(f"- {model}")
- google/flan-ul2
- ibm/granite-13b-instruct-v2
- ibm/granite-3-2b-instruct
- ibm/granite-3-8b-instruct
- meta-llama/llama-3-3-70b-instruct
- mistralai/ministral-8b-instruct
- mistralai/mistral-small-instruct

Create a Prompt Template object and store it in the project

We use a PromptTemplate object to store the properties of a newly created prompt template. The prompt text is composed of the instruction, input/output prefixes, few-shot examples, and input text. All of these fields may contain placeholders ({...}) with input variables; for the template to be valid, these input variables must also be listed in the input_variables parameter.

prompt_template = PromptTemplate(
    name="New prompt",
    model_id="ibm/granite-3-8b-instruct",
    model_params={GenParams.DECODING_METHOD: "sample"},
    description="My example",
    task_ids=["generation"],
    input_variables=["object"],
    instruction="Answer on the following question",
    input_prefix="Human",
    output_prefix="Assistant",
    input_text="What is {object} and how does it work?",
    examples=[
        [
            "What is a loan and how does it work?",
            "A loan is a debt that is repaid with interest over time.",
        ]
    ],
)

Using the store_prompt method, you can store the newly created prompt template in your project.

stored_prompt_template = prompt_mgr.store_prompt(prompt_template=prompt_template)
print(f"Asset id: {stored_prompt_template.prompt_id}")
print(f"Is it a template?: {stored_prompt_template.is_template}")
Asset id: 6659ce24-d5ca-4c51-87d2-b7689a7785a0
Is it a template?: True

We can also store a template that is a LangChain PromptTemplate object.

from langchain.prompts import PromptTemplate as LcPromptTemplate

langchain_prompt_template = LcPromptTemplate(
    template="What is {object} and how does it work?", input_variables=["object"]
)

stored_prompt_template_lc = prompt_mgr.store_prompt(
    prompt_template=langchain_prompt_template
)
print(f"Asset id: {stored_prompt_template_lc.prompt_id}")
Asset id: 93dad1e7-da70-49cd-b97b-c51b7ea71620

Manage Prompt Templates

prompt_mgr.list()

To retrieve a Prompt Template asset from the project and return a string containing the Prompt Template input, use load_prompt(prompt_template_id, astype=...). The returned input string is composed of the fields: instruction, input_prefix, output_prefix, examples, and input_text. After substituting the prompt variables, you can evaluate a language model on the obtained string.

prompt_text = prompt_mgr.load_prompt(
    prompt_id=stored_prompt_template.prompt_id, astype=PromptTemplateFormats.STRING
)
print(prompt_text)
Answer on the following question

Human What is a loan and how does it work?
Assistant A loan is a debt that is repaid with interest over time.

Human What is {object} and how does it work?
Assistant
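The string above is assembled from the template fields. A rough sketch of that composition in plain Python (illustrative only; the actual formatting, including whitespace, is internal to the ibm-watsonx-ai SDK and may differ):

```python
# Sketch: compose a prompt string from instruction, prefixed few-shot
# examples, and the (still unsubstituted) input text.
instruction = "Answer on the following question"
input_prefix = "Human"
output_prefix = "Assistant"
examples = [
    [
        "What is a loan and how does it work?",
        "A loan is a debt that is repaid with interest over time.",
    ]
]
input_text = "What is {object} and how does it work?"

parts = [instruction]
for question, answer in examples:
    parts.append(f"{input_prefix} {question}")
    parts.append(f"{output_prefix} {answer}")
parts.append(f"{input_prefix} {input_text}")
parts.append(output_prefix)  # the model continues from here

prompt = "\n".join(parts)
print(prompt)
```

Note that the {object} placeholder is left in place; it is only filled in at inference time via prompt_variables.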

To update the Prompt Template data, use the prompt_mgr.update_prompt method.

prompt_with_new_instruction = PromptTemplate(
    instruction="Answer on the following question in a concise way."
)
prompt_mgr.update_prompt(
    prompt_id=stored_prompt_template.prompt_id,
    prompt_template=prompt_with_new_instruction,
)

prompt_text = prompt_mgr.load_prompt(
    prompt_id=stored_prompt_template.prompt_id, astype=PromptTemplateFormats.STRING
)
print(prompt_text)
Answer on the following question in a concise way.

Human What is a loan and how does it work?
Assistant A loan is a debt that is repaid with interest over time.

Human What is {object} and how does it work?
Assistant

Furthermore, to get information about the locked state of a Prompt Template, run the following method:

prompt_mgr.get_lock(prompt_id=stored_prompt_template.prompt_id)
{'locked': True, 'locked_by': '1000330999', 'lock_type': 'edit'}

Using the lock or unlock method, you can change the locked state of a Prompt Template asset.

prompt_mgr.unlock(prompt_id=stored_prompt_template_lc.prompt_id)
{'locked': False}

Once the prompt template is unlocked, it can be deleted. You can also use the list method to check the available prompt templates (passing the limit=2 parameter returns only the 2 most recently created templates).

print(
    f"Id of the Prompt Template asset that will be deleted: {stored_prompt_template_lc.prompt_id}"
)
prompt_mgr.delete_prompt(prompt_id=stored_prompt_template_lc.prompt_id)

prompt_mgr.list(limit=2)
Id of the Prompt Template asset that will be deleted: 93dad1e7-da70-49cd-b97b-c51b7ea71620

Deployment pointing to Prompt Template

To create a deployment pointing to a Prompt Template asset, we have to set the default project on the initialized APIClient object.

client.set.default_project(project_id)
'SUCCESS'

In the deployment example we will use the prompt with the following input:

prompt_input_text = prompt_mgr.load_prompt(
    prompt_id=stored_prompt_template.prompt_id, astype=PromptTemplateFormats.STRING
)
print(prompt_input_text)
Answer on the following question in a concise way.

Human What is a loan and how does it work?
Assistant A loan is a debt that is repaid with interest over time.

Human What is {object} and how does it work?
Assistant

Now, we create a deployment by providing the ID of the Prompt Template asset and the meta props.

meta_props = {
    client.deployments.ConfigurationMetaNames.NAME: "SAMPLE DEPLOYMENT PROMPT TEMPLATE",
    client.deployments.ConfigurationMetaNames.ONLINE: {},
}

deployment_details = client.deployments.create(
    artifact_uid=stored_prompt_template.prompt_id, meta_props=meta_props
)
######################################################################################
Synchronous deployment creation for id: '6659ce24-d5ca-4c51-87d2-b7689a7785a0' started
######################################################################################

initializing
Note: online_url is deprecated and will be removed in a future release. Use serving_urls instead.
ready

-----------------------------------------------------------------------------------------------
Successfully finished deployment creation, deployment_id='55a73e9a-b9c3-433f-8124-c209489cd3f9'
-----------------------------------------------------------------------------------------------
client.deployments.list()

Substitute prompt variables and generate text

deployment_id = deployment_details.get("metadata", {}).get("id")
client.deployments.generate_text(
    deployment_id,
    params={
        "prompt_variables": {"object": "a mortgage"},
        GenParams.DECODING_METHOD: DecodingMethods.GREEDY,
        GenParams.STOP_SEQUENCES: ["\n\n"],
        GenParams.MAX_NEW_TOKENS: 50,
    },
)
' A mortgage is a type of loan used to purchase real estate. The borrower agrees to repay the loan, plus interest, over a set period, typically 15 to 30 years. The property serves as collateral'

Generate text using ModelInference

You can also generate text based on your Prompt Template deployment using the ModelInference class.

from ibm_watsonx_ai.foundation_models import ModelInference

model_inference = ModelInference(
    deployment_id=deployment_id, credentials=credentials, project_id=project_id
)
model_inference.generate_text(
    params={
        "prompt_variables": {"object": "a mortgage"},
        GenParams.DECODING_METHOD: DecodingMethods.GREEDY,
        GenParams.STOP_SEQUENCES: ["\n\n"],
        GenParams.MAX_NEW_TOKENS: 50,
    }
)
' A mortgage is a type of loan used to purchase real estate. The borrower agrees to repay the loan, plus interest, over a set period, typically 15 to 30 years. The property serves as collateral'

Summary and next steps

You successfully completed this notebook!

You learned how to create a valid Prompt Template and store it in a watsonx.ai project. Furthermore, you also learned how to create a deployment pointing to a Prompt Template asset and generate text using a base model.

Check out our Online Documentation for more samples, tutorials, documentation, how-tos, and blog posts.

Authors

Mateusz Świtała, Software Engineer at watsonx.ai.

Mateusz Szewczyk, Software Engineer at watsonx.ai.

Copyright © 2023-2025 IBM. This notebook and its source code are released under the terms of the MIT License.