LlamaIndex is another popular open source framework for building LLM applications. Like LangChain, LlamaIndex can be used to build RAG applications by easily combining the LLM with external data that is not baked into the model. There are three key tools in LlamaIndex:
- Connecting Data: Connect data of any type - structured, unstructured, or semi-structured - to the LLM
- Indexing Data: Index and store the data
- Querying the LLM: Combine the user query with the retrieved, query-related data to query the LLM and return a data-augmented answer
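Below is a minimal sketch of how these three pieces fit together, following the pre-0.10 `llama_index` package layout (imports differ in newer releases). It assumes a local `./data` folder of documents, and the LLM and embedding model default to whatever is configured in your environment; for Llama 2, you would point the service context at a Llama 2 endpoint.

```python
# Minimal RAG sketch with LlamaIndex (pre-0.10 package layout assumed).
from llama_index import SimpleDirectoryReader, VectorStoreIndex

# 1. Connect data: load local files (text, PDF, etc.) into Documents.
documents = SimpleDirectoryReader("./data").load_data()

# 2. Index data: embed and store the documents in an in-memory vector index.
index = VectorStoreIndex.from_documents(documents)

# 3. Query the LLM: retrieve relevant chunks, combine them with the question,
#    and return a data-augmented answer.
query_engine = index.as_query_engine()
print(query_engine.query("Summarize the key points of these documents."))
```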
LlamaIndex is primarily a data framework for connecting private or domain-specific data with LLMs, so it specializes in RAG: smart data storage and retrieval. LangChain, by contrast, is a more general-purpose framework that can be used to build agents connecting multiple tools. Combining the two may offer the most performant and effective solution for building real-world, RAG-powered Llama apps.
For an example of how to integrate LlamaIndex with Llama 2, see here. We also published a complete demo app showing how to use LlamaIndex to chat with Llama 2 about live data via the you.com API.
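As a rough illustration of that live-data pattern, the sketch below fetches fresh web snippets and indexes them on the fly. The you.com endpoint, response fields, and `YOUCOM_API_KEY` variable are assumptions (consult the you.com API docs for the current contract), and the LLM used to answer (e.g. Llama 2 served locally or via a hosted API) is assumed to already be configured in LlamaIndex's service context.

```python
# Hedged sketch: index live web search results and chat over them with LlamaIndex.
# The you.com endpoint, auth header, and response schema below are assumptions.
import os
import requests
from llama_index import Document, VectorStoreIndex

resp = requests.get(
    "https://api.ydc-index.io/search",                    # assumed endpoint
    headers={"X-API-Key": os.environ["YOUCOM_API_KEY"]},  # assumed auth header
    params={"query": "Llama 2"},
)
resp.raise_for_status()

# Flatten whatever snippet text the response contains into LlamaIndex Documents.
docs = []
for hit in resp.json().get("hits", []):
    for snippet in hit.get("snippets", []):
        docs.append(Document(text=snippet))

# Index the live snippets and ask the (pre-configured) LLM about them.
index = VectorStoreIndex.from_documents(docs)
print(index.as_query_engine().query("What is the latest news about Llama 2?"))
```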
It’s worth noting that LlamaIndex implements many evaluation tools for RAG-powered LLM applications, making it easy to measure the quality of both retrieval and response, including:
- Question Generation: Call the LLM to auto-generate questions and create an evaluation dataset.
- Faithfulness Evaluator: Evaluate whether the generated answer is faithful to the retrieved context or whether it hallucinates.
- Correctness Evaluator: Evaluate whether the generated answer matches the reference answer.
- Relevancy Evaluator: Evaluate whether the answer and the retrieved context are relevant and consistent with the given query.
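A hedged sketch of how these evaluators are typically wired up is shown below; module paths and signatures follow the pre-0.10 `llama_index.evaluation` layout and may differ in newer releases. The correctness evaluator is omitted here because it additionally requires reference answers.

```python
# Hedged sketch of LlamaIndex's built-in RAG evaluators (pre-0.10 layout assumed).
from llama_index import SimpleDirectoryReader, VectorStoreIndex
from llama_index.evaluation import (
    DatasetGenerator,        # question generation
    FaithfulnessEvaluator,   # is the answer grounded in the retrieved context?
    RelevancyEvaluator,      # are the answer and context relevant to the query?
)

documents = SimpleDirectoryReader("./data").load_data()
query_engine = VectorStoreIndex.from_documents(documents).as_query_engine()

# Auto-generate evaluation questions from the loaded documents.
questions = DatasetGenerator.from_documents(documents).generate_questions_from_nodes()

faithfulness = FaithfulnessEvaluator()
relevancy = RelevancyEvaluator()

for question in questions[:3]:
    response = query_engine.query(question)
    print(question)
    print("  faithful:", faithfulness.evaluate_response(response=response).passing)
    print("  relevant:", relevancy.evaluate_response(query=question, response=response).passing)
```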