Path: blob/master/cloud/notebooks/python_sdk/deployments/foundation_models/Use watsonx, and LlamaIndex with sql querey engine.ipynb
Use watsonx, and LlamaIndex for Text-to-SQL task
Disclaimers
Use only Projects and Spaces that are available in watsonx context.
Notebook content
This notebook contains the steps and code to demonstrate how natural language questions are synthesized into SQL queries using the watsonx.ai LlamaIndex integration and the LlamaIndex SQL query engine.
Some familiarity with Python is helpful. This notebook uses Python 3.11.
Learning goal
The goal of this notebook is to demonstrate how to use WatsonxLLM and WatsonxEmbeddings together with the LlamaIndex SQL query engine to retrieve the required information from a SQL database and answer a given question based on the returned data.
Contents
This notebook contains the following parts:
- Set up the environment
- LlamaIndex integration
- SQL Query Engine
- SQL query example
- Summary and next steps
Set up the environment
Before you use the sample code in this notebook, you must perform the following setup tasks:
Create a watsonx.ai Runtime Service instance (a free plan is offered and information about how to create the instance can be found here).
Install dependencies
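For illustration, a dependency install cell could look like the following sketch; the exact package list and version pins used in the notebook may differ.

```python
# Install LlamaIndex core plus the IBM watsonx.ai LLM and embeddings integrations,
# and SQLAlchemy for the local SQLite database.
!pip install -U llama-index
!pip install -U llama-index-llms-ibm
!pip install -U llama-index-embeddings-ibm
!pip install -U sqlalchemy
```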
Defining the watsonx.ai credentials
This cell defines the watsonx.ai credentials required to work with watsonx Foundation Model inferencing.
Action: Provide the IBM Cloud user API key. For details, see documentation.
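A minimal sketch of such a cell, assuming the credentials are kept in a plain dictionary and the API key is read interactively; the endpoint URL depends on the region of your watsonx.ai Runtime instance.

```python
import getpass

# Assumed structure: a simple dict holding the endpoint URL and the IBM Cloud user API key.
credentials = {
    "url": "https://us-south.ml.cloud.ibm.com",
    "apikey": getpass.getpass("Please enter your watsonx.ai API key (hit enter): "),
}
```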
Defining the project id
The foundation model requires a project id that provides the context for the call. We will try to obtain the id from the project in which this notebook runs; otherwise, please provide the project id explicitly.
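A possible way to do this, assuming the id may be exposed through a PROJECT_ID environment variable when the notebook runs inside a project:

```python
import os

# Use the project id of the current project if available, otherwise ask for it.
try:
    project_id = os.environ["PROJECT_ID"]
except KeyError:
    project_id = input("Please enter your project_id (hit enter): ")
```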
To begin, we generate dummy data. We create two tables: city_stats, which is taken from the LlamaIndex documentation, and sales_stats. Using SQLite, we store the created tables in memory.
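A sketch of the data-generation step; the schema and rows of sales_stats are assumptions, chosen only so that the example question about customer age and gender later in the notebook makes sense.

```python
from sqlalchemy import (
    create_engine,
    MetaData,
    Table,
    Column,
    String,
    Integer,
    insert,
)

# In-memory SQLite database accessed through SQLAlchemy.
engine = create_engine("sqlite:///:memory:")
metadata_obj = MetaData()

# "city_stats" table, as in the LlamaIndex documentation.
city_stats_table = Table(
    "city_stats",
    metadata_obj,
    Column("city_name", String(16), primary_key=True),
    Column("population", Integer),
    Column("country", String(16), nullable=False),
)

# "sales_stats" table -- schema assumed for this sketch.
sales_stats_table = Table(
    "sales_stats",
    metadata_obj,
    Column("customer_id", Integer, primary_key=True),
    Column("gender", String(8)),
    Column("age", Integer),
    Column("total_spent", Integer),
)

metadata_obj.create_all(engine)

# Insert a few dummy rows into each table.
with engine.begin() as connection:
    connection.execute(
        insert(city_stats_table),
        [
            {"city_name": "Toronto", "population": 2930000, "country": "Canada"},
            {"city_name": "Tokyo", "population": 13960000, "country": "Japan"},
        ],
    )
    connection.execute(
        insert(sales_stats_table),
        [
            {"customer_id": 1, "gender": "M", "age": 34, "total_spent": 120},
            {"customer_id": 2, "gender": "F", "age": 29, "total_spent": 80},
            {"customer_id": 3, "gender": "M", "age": 42, "total_spent": 200},
        ],
    )
```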
In this section we initialize the core components of the SQL query engine: the LlamaIndex WatsonxLLM client for model inference, and WatsonxEmbeddings, which we will use to embed the table schemas and later when fetching data from the index.
Action: For more details check the LlamaIndex documentation
LlamaIndex integration
WatsonxLLM is a wrapper around watsonx.ai foundation models that provides the LlamaIndex LLM integration for these models.
Initialize the WatsonxLLM instance.
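A minimal initialization sketch; the model_id and the generation parameters shown here are assumptions, so substitute any text-generation model available in your watsonx.ai instance.

```python
from llama_index.llms.ibm import WatsonxLLM

# Assumed model choice and decoding parameters for this sketch.
watsonx_llm = WatsonxLLM(
    model_id="mistralai/mixtral-8x7b-instruct-v01",
    url=credentials["url"],
    apikey=credentials["apikey"],
    project_id=project_id,
    temperature=0.0,
    max_new_tokens=256,
)
```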
Initialize the WatsonxEmbeddings instance.
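A similar sketch for the embedding model; the chosen model_id is an assumption and can be replaced with any watsonx.ai embedding model.

```python
from llama_index.embeddings.ibm import WatsonxEmbeddings

# Assumed embedding model for this sketch.
watsonx_embedding = WatsonxEmbeddings(
    model_id="ibm/slate-125m-english-rtrvr",
    url=credentials["url"],
    apikey=credentials["apikey"],
    project_id=project_id,
)
```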
SQL Query Engine
Once we have all the necessary components, we can move to the LlamaIndex SQL query engine. For more details, see the LlamaIndex documentation.
SQL Database and Settings
We use the SQLDatabase wrapper around the SQLAlchemy engine to interact with our local SQL database. Furthermore, we use Settings to set the global configuration. Settings is an object that lives throughout a session and contains a set of commonly used resources that are utilized during the indexing and querying stages in a LlamaIndex pipeline.
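A sketch of this step, reusing the SQLAlchemy engine and the two watsonx.ai models created earlier:

```python
from llama_index.core import SQLDatabase, Settings

# Wrap the SQLAlchemy engine and expose only the two tables created above.
sql_database = SQLDatabase(engine, include_tables=["city_stats", "sales_stats"])

# Register the watsonx.ai models as the session-wide defaults.
Settings.llm = watsonx_llm
Settings.embed_model = watsonx_embedding
```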
Finally, we create a natural language SQL table query engine (NLSQLTableQueryEngine).
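A minimal sketch of creating the engine over the two tables:

```python
from llama_index.core.query_engine import NLSQLTableQueryEngine

# Natural-language-to-SQL query engine over the selected tables.
query_engine = NLSQLTableQueryEngine(
    sql_database=sql_database,
    tables=["city_stats", "sales_stats"],
)
```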
SQL query example
Now we can provide a textual question; based on the LLM response, the query engine executes the proper SQL query and sends the result back to the LLM to produce the final answer to our question.
Let's ask what the average age of male customers in the sales_stats table is.
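For example:

```python
# Ask the question in natural language; the engine generates and runs the SQL for us.
response = query_engine.query(
    "What is the average age of male customers in the sales_stats table?"
)
print(response)
```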
In the response metadata we can also check what the generated SQL query looks like.
Checking the query:
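The generated statement is available in the response metadata, for example:

```python
# Inspect the SQL statement that the engine generated and executed.
print(response.metadata["sql_query"])
```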
Summary and next steps
You successfully completed this notebook!
You learned how to use the LlamaIndex SQL query engine with WatsonxLLM and WatsonxEmbeddings.
Check out our Online Documentation for more samples, tutorials, documentation, how-tos, and blog posts.
Authors:
Mateusz Świtała, Software Engineer at watsonx.ai.
Copyright © 2024-2025 IBM. This notebook and its source code are released under the terms of the MIT License.