Learn how to configure and manage prompts in your R2R deployment

Prompt Management in R2R

R2R provides a flexible system for managing prompts, allowing you to create, update, retrieve, and delete prompts dynamically. This system is crucial for customizing the behavior of language models and ensuring consistent interactions across your application.

Default Prompts

R2R comes with a set of default prompts that are loaded from YAML files located in the py/core/providers/database/prompts directory. These default prompts provide a starting point for various tasks within the R2R system.

For example, the default RAG (Retrieval-Augmented Generation) prompt is defined as follows:

default_rag:
  template: >
    ## Task:

    Answer the query given immediately below given the context which follows later. Use line item references to like [1], [2], ... refer to specifically numbered items in the provided context. Pay close attention to the title of each given source to ensure it is consistent with the query.

    ### Query:

    {query}

    ### Context:

    {context}

    ### Query:

    {query}

    REMINDER - Use line item references to like [1], [2], ... refer to specifically numbered items in the provided context.

    ## Response:
  input_types:
    query: str
    context: str

Default Prompt Usage

This table may fall out of date; refer to the prompts directory in the R2R repository as the source of truth.
| Prompt File | Purpose |
| --- | --- |
| default_rag.yaml | Default prompt for Retrieval-Augmented Generation (RAG) tasks. It instructs the model to answer queries based on the provided context, using line item references. |
| graphrag_community_reports.yaml | Used in GraphRAG to generate reports about communities or clusters in the knowledge graph. |
| graphrag_entity_description.yaml | Used in GraphRAG to generate descriptions of individual entities in the knowledge graph. |
| graphrag_map_system.yaml | System prompt for the "map" phase in GraphRAG, used to process individual nodes or edges. |
| graphrag_reduce_system.yaml | System prompt for the "reduce" phase in GraphRAG, used to combine or summarize information from multiple sources. |
| graphrag_triples_extraction_few_shot.yaml | Few-shot prompt for extracting subject-predicate-object triplets in GraphRAG, with examples. |
| hyde.yaml | Related to Hypothetical Document Embeddings (HyDE) for improving retrieval performance. |
| rag_agent.yaml | Defines the behavior and instructions for the RAG agent, which coordinates the retrieval and generation process. |
| rag_context.yaml | Used to process or format the context retrieved for RAG tasks. |
| rag_fusion.yaml | Used in RAG fusion techniques, possibly for combining information from multiple retrieved passages. |
| system.yaml | Contains general system-level prompts or instructions for the R2R system. |

You can find the full list of default prompts and their contents in the prompts directory.

Prompt Provider

R2R uses a PostgreSQL-backed prompt provider to manage prompts. This allows prompts to be stored, retrieved, and manipulated dynamically, leveraging both a PostgreSQL database and YAML files for flexibility and persistence.

Key features of prompt management in R2R:

  1. Database Storage: Prompts are stored in a PostgreSQL table, allowing for efficient querying and updates.
  2. YAML File Support: Prompts can be loaded from YAML files, providing an easy way to version control and distribute default prompts (a minimal loading sketch follows this list).
  3. In-Memory Cache: Prompts are kept in memory for fast access during runtime.
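
The sketch below illustrates the general load-from-YAML-and-cache pattern only; it is not the actual R2R provider implementation. It assumes the pyyaml package is available and uses the default_rag.yaml file from the prompts directory mentioned above.

import yaml  # requires the pyyaml package

def load_prompts(path: str) -> dict:
    """Read a prompts YAML file into an in-memory cache keyed by prompt name."""
    with open(path, "r") as f:
        definitions = yaml.safe_load(f)
    return {
        name: {
            "template": spec["template"],
            "input_types": spec.get("input_types", {}),
        }
        for name, spec in definitions.items()
    }

# Hypothetical usage against the default RAG prompt file:
prompts = load_prompts("py/core/providers/database/prompts/default_rag.yaml")
print(prompts["default_rag"]["input_types"])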

Prompt Structure

Each prompt in R2R consists of:

  • Name: A unique identifier for the prompt.
  • Template: The actual text of the prompt, which may include placeholders for dynamic content.
  • Input Types: A dictionary specifying the expected types for any dynamic inputs to the prompt.
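
For example, a custom prompt combining these three fields can be defined in YAML using the same layout as default_rag above (my_greeting_prompt is a hypothetical name used purely for illustration):

my_greeting_prompt:
  template: >
    Hello, {name}! Welcome to {service}.
  input_types:
    name: str
    service: str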

Managing Prompts

R2R provides several endpoints and SDK methods for managing prompts:

Adding a Prompt

To add a new prompt:

from r2r import R2RClient

client = R2RClient()

response = client.add_prompt(
    name="my_new_prompt",
    template="Hello, {name}! Welcome to {service}.",
    input_types={"name": "str", "service": "str"}
)

Updating a Prompt

To update an existing prompt:

response = client.update_prompt(
    name="my_existing_prompt",
    template="Updated template: {variable}",
    input_types={"variable": "str"}
)

Retrieving a Prompt

To get a specific prompt:

response = client.get_prompt(
    prompt_name="my_prompt",
    inputs={"variable": "example"},
    prompt_override="Optional override text"
)

Listing All Prompts

To retrieve all prompts:

response = client.get_all_prompts()

Deleting a Prompt

To delete a prompt:

response = client.delete_prompt("prompt_to_delete")

Security Considerations

Access to prompt management functions is restricted to superusers to prevent unauthorized modifications to system prompts. Ensure that only trusted administrators have superuser access to your R2R deployment.

Best Practices

  1. Version Control: Store your prompts in version-controlled YAML files for easy tracking of changes and rollbacks.
  2. Consistent Naming: Use a consistent naming convention for your prompts to make them easy to identify and manage.
  3. Input Validation: Always specify input types for your prompts to ensure that they receive the correct data types.
  4. Regular Audits: Periodically review and update your prompts to ensure they remain relevant and effective.
  5. Testing: Test prompts thoroughly before deploying them to production, especially if they involve complex logic or multiple input variables (a minimal smoke test is sketched after this list).
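
The following is a hypothetical pre-deployment check, not an official R2R testing utility. It assumes the my_new_prompt prompt added earlier and a reachable R2R server, and simply renders the prompt with representative inputs so the result can be inspected before production use.

from r2r import R2RClient

client = R2RClient()

# Render the prompt with representative sample inputs.
rendered = client.get_prompt(
    prompt_name="my_new_prompt",
    inputs={"name": "Ada", "service": "R2R"},
)

# Fail loudly if any placeholder was left unfilled, then inspect the output.
assert "{name}" not in str(rendered) and "{service}" not in str(rendered)
print(rendered)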

Advanced Usage

Dynamic Prompt Loading

R2R’s prompt system allows for dynamic loading of prompts from both the database and YAML files. This enables you to:

  1. Deploy default prompts with your application.
  2. Override or extend these prompts at runtime.
  3. Easily update prompts without redeploying your entire application.
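
As a hedged sketch of this runtime-override workflow, built from the update_prompt SDK call shown earlier and the default_rag prompt that ships with R2R, a deployed default can be replaced without redeploying the application:

from r2r import R2RClient

client = R2RClient()

# Replace the stored template for the shipped default RAG prompt at runtime.
# The input types mirror those declared in default_rag.yaml.
response = client.update_prompt(
    name="default_rag",
    template=(
        "Answer the query using ONLY the numbered context items.\n\n"
        "### Query:\n{query}\n\n"
        "### Context:\n{context}\n\n"
        "## Response:"
    ),
    input_types={"query": "str", "context": "str"},
)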

Prompt Templating

The prompt template system in R2R supports complex string formatting. You can include conditional logic, loops, and other template expressions within your prompts using a templating engine, as in the Jinja-style example below.

Example of a more complex prompt template:

complex_template = """
Given the following information:
{% for item in data %}
- {{ item.name }}: {{ item.value }}
{% endfor %}

Please provide a summary that {% if include_analysis %}includes an analysis of the data{% else %}lists the key points{% endif %}.
"""

client.add_prompt(
    name="complex_summary",
    template=complex_template,
    input_types={"data": "list", "include_analysis": "bool"}
)
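
To illustrate how such a prompt might then be used, here is a hedged example building on the get_prompt call shown earlier; the sample metric names are made up for illustration:

response = client.get_prompt(
    prompt_name="complex_summary",
    inputs={
        "data": [
            {"name": "latency_ms", "value": 120},
            {"name": "recall", "value": 0.87},
        ],
        "include_analysis": True,
    },
)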

This flexibility allows you to create highly dynamic and context-aware prompts that can adapt to various scenarios in your application.

Conclusion

R2R’s prompt management system provides a powerful and flexible way to control the behavior of language models in your application. By leveraging this system effectively, you can create more dynamic, context-aware, and maintainable AI-powered features.

For more detailed information on other aspects of R2R configuration, please refer to the following pages: