This page introduces how to build LLM-powered applications with LangChain. The overviews on this page link to procedure guides on GitHub.
What is LangChain?
LangChain is an LLM orchestration framework that helps developers build generative AI applications or retrieval-augmented generation (RAG) workflows. It provides the structure, tools, and components to streamline complex LLM workflows.
For more information about LangChain, see the Google LangChain page. For more information about the LangChain framework, see the LangChain product documentation.
LangChain components for Memorystore for Redis
Memorystore for Redis offers the following LangChain interfaces:
- Vector store
- Document loader
- Chat message history
Learn how to use LangChain with the LangChain Quickstart for Memorystore for Redis.
Vector store for Memorystore for Redis
Vector store stores and retrieves documents and metadata from a vector database. Vector store gives an application the ability to perform semantic searches that interpret the meaning of a user query. This type of search is called a vector search, and it can find topics that match the query conceptually. At query time, vector store retrieves the embedding vectors that are most similar to the embedding of the search request. In LangChain, a vector store takes care of storing embedded data and performing the vector search for you.
To work with vector store in Memorystore for Redis, use the RedisVectorStore class.
For more information, see the LangChain Vector stores product documentation.
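The following sketch shows the basic flow: create a vector index, wrap it in a RedisVectorStore, add documents, and run a similarity search. It assumes the langchain-google-memorystore-redis and langchain-google-vertexai packages, a reachable Redis endpoint, and Vertex AI credentials. The HNSWConfig, DistanceStrategy, and init_index names, the constructor parameters, and the embedding model name reflect a recent version of the integration and might differ from the procedure guide, so treat this as a sketch rather than the canonical steps.

```python
import redis
from langchain_google_memorystore_redis import (
    DistanceStrategy,
    HNSWConfig,
    RedisVectorStore,
)
from langchain_google_vertexai import VertexAIEmbeddings

# Connect to the Memorystore for Redis endpoint (replace with your instance IP).
redis_client = redis.from_url("redis://127.0.0.1:6379")

# Create an HNSW vector index sized to the embedding model's output (assumed 768).
index_config = HNSWConfig(
    name="my_vector_index",
    distance_strategy=DistanceStrategy.COSINE,
    vector_size=768,
)
RedisVectorStore.init_index(client=redis_client, index_config=index_config)

# Wrap the index in a LangChain vector store backed by Vertex AI embeddings.
vector_store = RedisVectorStore(
    client=redis_client,
    index_name="my_vector_index",
    embeddings=VertexAIEmbeddings(model_name="textembedding-gecko@003"),
)

# Add documents, then run a semantic (KNN) similarity search.
vector_store.add_texts(
    [
        "Memorystore for Redis is a managed Redis service.",
        "LangChain is an LLM orchestration framework.",
    ]
)
results = vector_store.similarity_search("What is Memorystore?", k=2)
```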
Vector store procedure guide
The Memorystore for Redis guide for vector store shows you how to do the following (a brief sketch of the later operations follows the list):
- Install the integration package and LangChain
- Initialize a vector index
- Prepare documents for the vector store
- Add documents to the vector store
- Perform a similarity search (KNN)
- Perform a range-based similarity search
- Perform a Maximal Marginal Relevance (MMR) search
- Use the vector store as a retriever
- Delete documents from the vector store
- Delete a vector index
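The later steps in the list (MMR search, retriever usage, and cleanup) might look like the following continuation of the earlier sketch. max_marginal_relevance_search, as_retriever, and delete are standard LangChain vector store methods; drop_index is an assumed RedisVectorStore class method, so confirm the exact call in the guide.

```python
# Continues the vector_store sketch above.

# Diversify results with Maximal Marginal Relevance (MMR).
diverse_docs = vector_store.max_marginal_relevance_search(
    "What is Memorystore?", k=4, lambda_mult=0.5
)

# Use the vector store as a retriever, for example inside a RAG chain.
retriever = vector_store.as_retriever(search_kwargs={"k": 4})
relevant_docs = retriever.invoke("What is Memorystore?")

# Delete documents by ID, then drop the index itself (assumed signature).
ids = vector_store.add_texts(["Temporary document"])
vector_store.delete(ids)
RedisVectorStore.drop_index(client=redis_client, index_name="my_vector_index")
```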
Document loader for Memorystore for Redis
The document loader saves, loads, and deletes LangChain Document objects. For example, you can load data for processing into embeddings and either store it in the vector store or use it as a tool to provide specific context to chains.
To load documents with the document loader in Memorystore for Redis, use the MemorystoreDocumentLoader class. To save and delete documents, use the MemorystoreDocumentSaver class.
For more information, see the LangChain Document loaders topic.
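As a rough illustration, the following sketch saves a few Document objects with MemorystoreDocumentSaver and loads them back with MemorystoreDocumentLoader. The key_prefix, content_field, and content_fields parameter names are assumptions based on a recent version of the integration; the procedure guide has the authoritative steps.

```python
import redis
from langchain_core.documents import Document
from langchain_google_memorystore_redis import (
    MemorystoreDocumentLoader,
    MemorystoreDocumentSaver,
)

# Connect to the Memorystore for Redis endpoint (replace with your instance IP).
redis_client = redis.from_url("redis://127.0.0.1:6379")

# Save Documents as Redis hashes under a common key prefix (parameter names assumed).
saver = MemorystoreDocumentSaver(
    client=redis_client,
    key_prefix="docs:",
    content_field="page_content",
)
saver.add_documents(
    [
        Document(page_content="Memorystore is a managed Redis service.",
                 metadata={"topic": "overview"}),
        Document(page_content="LangChain orchestrates LLM workflows.",
                 metadata={"topic": "langchain"}),
    ]
)

# Load the stored hashes back as LangChain Documents.
loader = MemorystoreDocumentLoader(
    client=redis_client,
    key_prefix="docs:",
    content_fields={"page_content"},
)
docs = loader.load()
```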
Document loader procedure guide
The Memorystore for Redis guide for document loader shows you how to do the following (a brief sketch of the customization and cleanup steps follows the list):
- Install the integration package and LangChain
- Load documents from a table
- Add a filter to the loader
- Customize the connection and authentication
- Customize Document construction by specifying custom content and metadata
- Use and customize a MemorystoreDocumentSaver to store and delete documents
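A hedged sketch of the customization steps: build Documents from specific hash fields (one field as page_content, the others as metadata), then delete everything under the key prefix. The content_fields, metadata_fields, and content_field parameter names, and delete() clearing all keys under the prefix, are assumptions to verify against the guide.

```python
# Continues the redis_client sketch above.
from langchain_google_memorystore_redis import (
    MemorystoreDocumentLoader,
    MemorystoreDocumentSaver,
)

# Construct Documents from selected hash fields (parameter names assumed).
loader = MemorystoreDocumentLoader(
    client=redis_client,
    key_prefix="products:",
    content_fields={"description"},     # becomes Document.page_content
    metadata_fields={"name", "price"},  # become Document.metadata
)
product_docs = loader.load()

# Store and later delete documents under the same prefix.
saver = MemorystoreDocumentSaver(
    client=redis_client,
    key_prefix="products:",
    content_field="description",
)
saver.add_documents(product_docs)
saver.delete()  # assumed to remove every document stored under "products:"
```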
Chat message history for Memorystore for Redis
Question and answer applications require a history of the conversation so far to give the application context for answering further questions from the user. The LangChain ChatMessageHistory class lets the application save messages to a database and retrieve them when needed to formulate further answers. A message can be a question, an answer, a statement, a greeting, or any other piece of text that the user or application gives during the conversation. ChatMessageHistory stores each message and chains messages together for each conversation.

Memorystore for Redis extends this class with MemorystoreChatMessageHistory.
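A minimal sketch of the pattern, assuming the langchain-google-memorystore-redis package and a reachable Redis endpoint: the constructor parameters (client, session_id) are assumptions, while add_user_message, add_ai_message, messages, and clear come from LangChain's standard chat message history interface.

```python
import redis
from langchain_google_memorystore_redis import MemorystoreChatMessageHistory

# Connect to the Memorystore for Redis endpoint (replace with your instance IP).
redis_client = redis.from_url("redis://127.0.0.1:6379")

# Each conversation is keyed by a session ID (constructor parameters assumed).
history = MemorystoreChatMessageHistory(
    client=redis_client,
    session_id="session-42",
)

# Append messages as the conversation progresses.
history.add_user_message("What maintenance windows does my instance have?")
history.add_ai_message("Your instance has a weekly window on Sundays at 02:00.")

# Read the stored conversation to build context for the next answer.
print(history.messages)

# Delete the conversation when it is no longer needed.
history.clear()
```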
Chat message history procedure guide
The Memorystore for Redis guide for chat message history shows you how to:
- Install LangChain and authenticate to Google Cloud
- Initialize the MemorystoreChatMessageHistory class to add and delete messages
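To show how the saved history feeds context back into a model, here is a hedged sketch that wires MemorystoreChatMessageHistory into a LangChain runnable. RunnableWithMessageHistory and the prompt classes are standard LangChain APIs; the ChatVertexAI model name and the history constructor parameters are assumptions, not steps from the guide.

```python
import redis
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_core.runnables.history import RunnableWithMessageHistory
from langchain_google_memorystore_redis import MemorystoreChatMessageHistory
from langchain_google_vertexai import ChatVertexAI

# Connect to the Memorystore for Redis endpoint (replace with your instance IP).
redis_client = redis.from_url("redis://127.0.0.1:6379")

# Prompt that injects the stored conversation before the new question.
prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a helpful assistant."),
        MessagesPlaceholder(variable_name="history"),
        ("human", "{question}"),
    ]
)
chain = prompt | ChatVertexAI(model_name="gemini-1.0-pro")  # assumed model name

# Look up the Memorystore-backed history for each session.
chain_with_history = RunnableWithMessageHistory(
    chain,
    lambda session_id: MemorystoreChatMessageHistory(
        client=redis_client, session_id=session_id  # assumed parameters
    ),
    input_messages_key="question",
    history_messages_key="history",
)

# Every call with the same session_id sees the earlier messages.
answer = chain_with_history.invoke(
    {"question": "Which maintenance window did you mention earlier?"},
    config={"configurable": {"session_id": "session-42"}},
)
```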