
Process many chat prompts in parallel using multisession future workers. Use this function to handle a batch of prompts simultaneously rather than one at a time. For sequential processing, use seq_chat().

Usage

future_chat(chat_model = NULL, ...)

Arguments

chat_model

Character string specifying the chat model to use (e.g., "openai/gpt-4.1" or "anthropic/claude-3-5-sonnet-latest"). This creates an ellmer chat object using ellmer::chat().

...

Additional arguments passed to the underlying chat model (e.g., system_prompt).
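
For instance, a system prompt supplied through ... is forwarded to ellmer::chat() when the chat object is created. A minimal sketch (assuming OpenAI credentials are configured; the model name is illustrative):

chat <- future_chat(
  "openai/gpt-4.1",
  system_prompt = "Reply in one concise sentence."
)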

Value

An R6 object with the following methods:

  • $process(): Function to process multiple prompts in parallel. Takes a vector or list of prompts and processes them simultaneously using multiple workers with persistent caching. Returns a process object containing results and helper functions. See ?process.future_chat for full details of the method and its parameters.

  • $register_tool(): Function to register tools (R functions the model can call) for use during chat interactions. Works the same as ellmer's $register_tool().
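
A tool is registered before processing, and every worker then has it available. A minimal sketch, assuming ellmer's tool() and type_string() helpers (their signatures vary between ellmer versions) and OpenAI credentials:

# Hypothetical tool: look up the current time in a time zone
get_time <- function(tz = "UTC") {
  format(Sys.time(), tz = tz, usetz = TRUE)
}

chat <- future_chat("openai/gpt-4.1")

# Register the tool exactly as you would on an ellmer chat object
chat$register_tool(ellmer::tool(
  get_time,
  "Gets the current time in the given time zone.",
  arguments = list(
    tz = ellmer::type_string("A time zone name, e.g. 'UTC'.", required = FALSE)
  )
))

response <- chat$process(c(
  "What time is it in UTC?",
  "What time is it in Asia/Tokyo?"
))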

Examples

if (FALSE) { # interactive() && ellmer::has_credentials("openai")
# Create chat processor
chat <- future_chat("openai/gpt-4.1")

# Process prompts
response <- chat$process(
  c(
    "What is R?",
    "Explain base R versus tidyverse",
    "Explain vectors, lists, and data frames"
  )
)

# Return responses
response$texts()

# Return chat objects
response$chats()

# Check progress if interrupted
response$progress()
}