RedisAI Tutorial


Author: Ajeet Raina, Former Developer Growth Manager at Redis

RedisAI is a Redis module for executing deep learning/machine learning models and managing their data. It provides tensors as a data type and executes deep learning models on CPUs and GPUs, turning Redis Enterprise into a full-fledged deep learning runtime. The RedisAI module plugs seamlessly into Redis. It is a scalable platform that addresses the unique requirements of both AI training and AI inference in one server, and it gives data scientists a complete software platform for easily deploying and managing AI solutions for enterprise applications.

The platform combines popular open source deep learning frameworks (PyTorch, ONNXRuntime, and TensorFlow), software libraries, and Redis modules like RedisGears, RedisTimeSeries, and more. With RedisAI, AI application developers no longer have to worry about tuning databases for performance. Requiring no added infrastructure, RedisAI lets you run your inference engine where the data lives, decreasing latency.

Below is an interesting example of Iris (a genus of flowering plants with showy flowers) classification: measurements of sepal and petal width and length make up the input tensors, and the following steps show how to load those measurements into RedisAI and run a model against them:

Step 1. Installing RedisAI

 docker run \
-p 6379:6379 \
redislabs/redismod \
--loadmodule /usr/lib/redis/modules/redisai.so

You can verify that the RedisAI module is loaded by running the following command:

 127.0.0.1:6379> info modules
# Modules
module:name=ai,ver=10003,api=1,filters=0,usedby=[],using=[],options=[]

# ai_git
ai_git_sha:7f808a934dff121e188cb76fdfcc3eb1f9ec7cbf

# ai_load_time_configs
ai_threads_per_queue:1
ai_inter_op_parallelism:0
ai_intra_op_parallelism:0

Step 2. Set Up the Python Environment

Ensure that Python 3.9+ is installed.

 brew install python

Create a Python virtual environment and activate it:

 python3.9 -m venv venv
. ./venv/bin/activate

Step 3. Upgrade pip

 pip install --upgrade pip

Step 4. Clone the repository

 git clone https://github.com/redis-developer/redisai-iris

Step 5. Install the dependencies

 cd redisai-iris
pip install -r requirements.txt
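
With the dependencies installed, you can also confirm from Python that the Redis server from Step 1 is reachable and that the ai module is loaded. This is a minimal sketch using the redis-py client (install it with pip install redis if requirements.txt doesn't already include it) and the default localhost:6379 address:

import redis

# Assumes the RedisAI container from Step 1 is listening on localhost:6379
r = redis.Redis(host="localhost", port=6379)
print(r.ping())  # True if the server is reachable

# MODULE LIST returns one entry per loaded module; look for the one named "ai"
for module in r.execute_command("MODULE", "LIST"):
    print(module)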

Step 6. Build the TorchScript Model

RedisAI supports several DL/ML backend identifiers and their respective libraries, including:

  • TF: The TensorFlow backend
  • TFLITE: The TensorFlow Lite backend
  • TORCH: The PyTorch backend
  • ONNX: The ONNXRuntime backend

A complete list of supported backends is in the release notes for each version.

 python build.py
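
build.py trains the model and saves it as a TorchScript file named iris.pt. As an illustration only (the repository's actual architecture and training loop live in build.py), a TorchScript export that returns the two outputs used later in this tutorial, predicted classes and raw scores, could look like this:

import torch
import torch.nn as nn

# Hypothetical network for illustration; the real architecture is defined in build.py
class IrisNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.layers = nn.Sequential(nn.Linear(4, 16), nn.ReLU(), nn.Linear(16, 3))

    def forward(self, x):
        scores = self.layers(x)                   # raw scores per class
        inferences = torch.argmax(scores, dim=1)  # predicted class index per sample
        return inferences, scores

model = IrisNet().eval()
example_input = torch.rand(2, 4)                  # 2 samples x 4 features
traced = torch.jit.trace(model, example_input)    # convert to TorchScript
torch.jit.save(traced, "iris.pt")                 # the file deployed in Step 7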

Step 7. Deploy the Model into RedisAI

A Model is a Deep Learning or Machine Learning frozen graph that was generated by some framework. The RedisAI Model data structure represents a DL/ML model that is stored in the database and can be run. Models, like any other Redis and RedisAI data structures, are identified by keys. A Model's key is created with the AI.MODELSTORE command, which takes the serialized model as its payload (a protobuf graph for TensorFlow, or, as in this tutorial, a TorchScript blob for PyTorch).

note

This requires redis-cli. If you don't have redis-cli, the easiest way to get it is to download, build, and install Redis itself. Details can be found on the Redis quickstart page.

 redis-cli -x AI.MODELSTORE iris TORCH CPU BLOB < iris.pt
OK
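
If you prefer to deploy from Python instead of redis-cli, a minimal sketch using redis-py's generic execute_command (assuming iris.pt is in the current directory and Redis is on localhost:6379) looks like this:

import redis

r = redis.Redis(host="localhost", port=6379)

# Read the serialized TorchScript model and store it under the key 'iris' on the CPU device
with open("iris.pt", "rb") as f:
    model_blob = f.read()

r.execute_command("AI.MODELSTORE", "iris", "TORCH", "CPU", "BLOB", model_blob)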

Step 8. Make Some Predictions

The AI.TENSORSET command stores a tensor as the value of a key.

Launch redis-cli:

 redis-cli

Step 9. Set the input tensor

This will set the key 'iris:in' to a 2x4 RedisAI tensor (i.e. 2 sets of inputs of 4 values each).

 AI.TENSORSET iris:in FLOAT 2 4 VALUES 5.0 3.4 1.6 0.4 6.0 2.2 5.0 1.5

where:

  • iris:in refers to the tensor's key name,
  • FLOAT is the tensor's data type,
  • {5.0 3.4 1.6 0.4} refers to the 1st item, with 4 features,
  • {6.0 2.2 5.0 1.5} refers to the 2nd item, with 4 features.
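
The same tensor can be set from Python with redis-py's execute_command, mirroring the redis-cli command above:

import redis

r = redis.Redis(host="localhost", port=6379)

# A 2x4 FLOAT tensor: two samples, four features each
r.execute_command(
    "AI.TENSORSET", "iris:in", "FLOAT", 2, 4,
    "VALUES", 5.0, 3.4, 1.6, 0.4, 6.0, 2.2, 5.0, 1.5,
)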

Step 10. Display TENSORGET in BLOB format

The AI.TENSORGET command returns the tensor stored as the key's value. The BLOB argument indicates that the data should be returned in binary format:

 redis-cli AI.TENSORGET iris:in BLOB
"\x00\x00\xa0@\x9a\x99Y@\xcd\xcc\xcc?\xcd\xcc\xcc>\x00\x00\xc0@\xcd\xcc\x0c@\x00\x00\xa0@\x00\x00\xc0?"

Step 11. Display TENSORGET in VALUES format

The VALUES argument returns the tensor's values. They come back as the nearest single-precision (float32) representations of the numbers set above, which is why 3.4 appears as 3.4000000953674316:

 redis-cli AI.TENSORGET iris:in VALUES
1) "5"
2) "3.4000000953674316"
3) "1.6000000238418579"
4) "0.40000000596046448"
5) "6"
6) "2.2000000476837158"
7) "5"
8) "1.5"

Step 12. Display TENSORGET META information

The META argument used with AI.TENSORGET returns the tensor's metadata, as shown below:

 redis-cli AI.TENSORGET iris:in META
1) "dtype"
2) "FLOAT"
3) "shape"
4) 1) (integer) 2
   2) (integer) 4

Step 13. Display TENSORGET META information with tensor values

 redis-cli AI.TENSORGET iris:in META VALUES
1) "dtype"
2) "FLOAT"
3) "shape"
4) 1) (integer) 2
   2) (integer) 4
5) "values"
6) 1) "5"
   2) "3.4000000953674316"
   3) "1.6000000238418579"
   4) "0.40000000596046448"
   5) "6"
   6) "2.2000000476837158"
   7) "5"
   8) "1.5"

Step 14. Run the model

Run the loaded model on the input tensor, storing its two outputs: the predicted classes (iris:inferences) and their scores (iris:scores).

 redis-cli AI.MODELRUN iris INPUTS iris:in OUTPUTS iris:inferences iris:scores
OK
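
Newer RedisAI releases also offer AI.MODELEXECUTE, which takes explicit input and output counts, but AI.MODELRUN works with the model stored above. The same run from Python, as a sketch with redis-py:

import redis

r = redis.Redis(host="localhost", port=6379)

# Run the 'iris' model on iris:in and write the two output tensors
r.execute_command(
    "AI.MODELRUN", "iris",
    "INPUTS", "iris:in",
    "OUTPUTS", "iris:inferences", "iris:scores",
)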

Step 15. View the predictions

 redis-cli AI.TENSORGET iris:inferences META VALUES
1) "dtype"
2) "INT64"
3) "shape"
4) 1) (integer) 2
5) "values"
6) 1) (integer) 0
   2) (integer) 2
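
The integers in iris:inferences are class indices for the two input samples. Assuming the model was trained with scikit-learn's default Iris label ordering (0 = setosa, 1 = versicolor, 2 = virginica; check build.py to confirm), a short sketch that turns the outputs into readable labels could look like this:

import redis

r = redis.Redis(host="localhost", port=6379)

# Assumed label ordering; verify against the training code in build.py
CLASSES = ["setosa", "versicolor", "virginica"]

inferences = r.execute_command("AI.TENSORGET", "iris:inferences", "VALUES")
scores = r.execute_command("AI.TENSORGET", "iris:scores", "VALUES")

for i, class_idx in enumerate(inferences):
    print(f"sample {i}: predicted class = {CLASSES[int(class_idx)]}")
print("raw scores:", scores)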

References

Redis University

RedisAI Explained

RedisAI from the Command Line