Configure your prompts

Prompt Management in R2R

R2R provides a flexible system for managing prompts, allowing you to create, update, retrieve, and delete prompts dynamically. This system is crucial for customizing the behavior of language models and ensuring consistent interactions across your application.

Default Prompts

R2R comes with a set of default prompts that are loaded from YAML files located in the py/core/providers/database/prompts directory. These default prompts provide a starting point for various tasks within the R2R system.

For example, the default RAG (Retrieval-Augmented Generation) prompt is defined as follows:

```yaml
default_rag:
  template: >
    ## Task:

    Answer the query given immediately below given the context which follows later. Use line item references like [1], [2], ... to refer to specifically numbered items in the provided context. Pay close attention to the title of each given source to ensure it is consistent with the query.

    ### Query:

    {query}

    ### Context:

    {context}

    ### Query:

    {query}

    REMINDER - Use line item references like [1], [2], ... to refer to specifically numbered items in the provided context.

    ## Response:
  input_types:
    query: str
    context: str
```
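The placeholders in the template (`{query}` and `{context}`) match the keys declared under `input_types` and are filled in at render time. A minimal sketch of that substitution using Python's `str.format`, with an illustrative shortened template rather than the full `default_rag` text:

```python
# Illustrative sketch: how a template's declared placeholders might be filled.
# The template and inputs are made up for the example.
template = (
    "### Query:\n{query}\n\n"
    "### Context:\n{context}\n\n"
    "## Response:"
)

inputs = {"query": "What is R2R?", "context": "[1] R2R is a RAG framework."}
prompt = template.format(**inputs)
print(prompt)
```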

Default Prompt Usage

This table can fall out of date; refer to the prompts directory in the R2R repository as the source of truth.

| Prompt File | Purpose |
| --- | --- |
| `default_rag.yaml` | Default prompt for Retrieval-Augmented Generation (RAG) tasks. It instructs the model to answer queries based on provided context, using line item references. |
| `graphrag_community_reports.yaml` | Used in GraphRAG to generate reports about communities or clusters in the knowledge graph. |
| `graphrag_entity_description.yaml` | Used in GraphRAG to generate descriptions of individual entities (nodes) in the knowledge graph. |
| `graphrag_map_system.yaml` | System prompt for the "map" phase in GraphRAG, used to process individual nodes or edges. |
| `graphrag_reduce_system.yaml` | System prompt for the "reduce" phase in GraphRAG, used to combine or summarize information from multiple sources. |
| `graphrag_triples_extraction_few_shot.yaml` | Few-shot prompt for extracting subject-predicate-object triplets in GraphRAG, with examples. |
| `hyde.yaml` | Related to Hypothetical Document Embeddings (HyDE) for improving retrieval performance. |
| `rag_agent.yaml` | Defines the behavior and instructions for the RAG agent, which coordinates the retrieval and generation process. |
| `rag_context.yaml` | Used to process or format the context retrieved for RAG tasks. |
| `rag_fusion.yaml` | Used in RAG fusion techniques, possibly for combining information from multiple retrieved passages. |
| `system.yaml` | Contains general system-level prompts or instructions for the R2R system. |

You can find the full list of default prompts and their contents in the prompts directory.

Prompt Provider

R2R manages prompts through a Postgres-backed provider. This allows for storage, retrieval, and manipulation of prompts, leveraging both a Postgres database and YAML files for flexibility and persistence.

Key features of prompts inside R2R:

  1. Database Storage: Prompts are stored in a Postgres table, allowing for efficient querying and updates.
  2. YAML File Support: Prompts can be loaded from YAML files, providing an easy way to version control and distribute default prompts.
  3. In-Memory Cache: Prompts are kept in memory for fast access during runtime.
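The interplay of these three layers can be sketched as a cache-aside pattern. This is an illustrative stand-in, not R2R's actual implementation: plain dictionaries play the roles of the Postgres table and the YAML defaults.

```python
# Illustrative cache-aside sketch of the three storage layers above.
# Dictionaries stand in for the Postgres table and the YAML files.
yaml_defaults = {"default_rag": "Answer the query given the context."}  # YAML files
db_table = {}       # stand-in for the Postgres table
memory_cache = {}   # in-memory cache for fast runtime access

def get_prompt(name: str) -> str:
    if name in memory_cache:            # 3. serve from memory when possible
        return memory_cache[name]
    template = db_table.get(name)       # 1. otherwise consult the database
    if template is None:
        template = yaml_defaults[name]  # 2. fall back to the YAML default
        db_table[name] = template       #    and persist it
    memory_cache[name] = template
    return template

print(get_prompt("default_rag"))
```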

Prompt Structure

Each prompt in R2R consists of:

  • Name: A unique identifier for the prompt.
  • Template: The actual text of the prompt, which may include placeholders for dynamic content.
  • Input Types: A dictionary specifying the expected types for any dynamic inputs to the prompt.
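The three fields above can be modeled as a small record type. The class below is an illustrative sketch, not the actual R2R class; the `format` method shows how `input_types` can be used to validate the dynamic inputs before substitution:

```python
from dataclasses import dataclass

@dataclass
class Prompt:
    """Illustrative sketch of a prompt record; not the actual R2R class."""
    name: str
    template: str
    input_types: dict[str, str]

    def format(self, **inputs) -> str:
        # Reject any input that was not declared in input_types.
        unexpected = set(inputs) - set(self.input_types)
        if unexpected:
            raise ValueError(f"Unexpected inputs: {unexpected}")
        return self.template.format(**inputs)

greeting = Prompt(
    name="greeting",
    template="Hello, {name}!",
    input_types={"name": "str"},
)
print(greeting.format(name="Ada"))  # -> Hello, Ada!
```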

Managing Prompts

R2R provides several endpoints and SDK methods for managing prompts:

Adding a Prompt

To add a new prompt:

```python
from r2r import R2RClient

client = R2RClient()

response = client.prompts.add_prompt(
    name="my_new_prompt",
    template="Hello, {name}! Welcome to {service}.",
    input_types={"name": "str", "service": "str"}
)
```

Updating a Prompt

To update an existing prompt:

```python
response = client.prompts.update_prompt(
    name="my_existing_prompt",
    template="Updated template: {variable}",
    input_types={"variable": "str"}
)
```

Retrieving a Prompt

To get a specific prompt:

```python
response = client.prompts.get_prompt(
    prompt_name="my_prompt",
    inputs={"variable": "example"},
    prompt_override="Optional override text"
)
```

For full details, refer to the Prompt API Reference.

Security Considerations

Access to prompt management functions is restricted to superusers to prevent unauthorized modifications to system prompts. Ensure that only trusted administrators have superuser access to your R2R deployment.

Conclusion

R2R’s prompt management system provides a powerful and flexible way to control the behavior of language models in your application. By leveraging this system effectively, you can create more dynamic, context-aware, and maintainable AI-powered features.