Some clients may send several queries to the server at once (for example, Apollo Client’s query batching). You can execute them concurrently with Schema#multiplex.

Multiplex runs have their own context, analyzers and instrumentation.

NOTE: As an implementation detail, all queries run inside multiplexes. That is, a stand-alone query is executed as a “multiplex of one”, so multiplex instrumentation, analyzers, and tracers also apply to stand-alone queries run with MySchema.execute(...).

Concurrent Execution

To run queries concurrently, build an array of query options, using query: for the query string. For example:

# Prepare the context for each query:
context = {
  current_user: current_user,
}

# Prepare the query options:
queries = [
  {
    query: "query Query1 { someField }",
    variables: {},
    operation_name: 'Query1',
    context: context,
  },
  {
    query: "query Query2 ($num: Int){ plusOne(num: $num) }",
    variables: { num: 3 },
    operation_name: 'Query2',
    context: context,
  },
]

Then, pass them to Schema#multiplex:

results = MySchema.multiplex(queries)

results will contain the result for each query in queries. NOTE: The results are always returned in the same order as their corresponding queries.
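Because ordering is guaranteed, results can be matched back to their operations by position. A minimal sketch with hand-written response hashes, whose shapes mirror the two queries above:

```ruby
# Hypothetical response hashes, in the same order as the queries were sent:
results = [
  { "data" => { "someField" => "hello" } },
  { "data" => { "plusOne" => 4 } },
]

# Pair each result with its operation name by position:
operation_names = ["Query1", "Query2"]
by_operation = operation_names.zip(results).to_h

by_operation["Query2"] # => { "data" => { "plusOne" => 4 } }
```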

Apollo Query Batching

Apollo sends batches of queries as an array of queries. Rails’ ActionDispatch parses the request and puts the result into the _json field of the params variable. You also need to ensure that your schema can handle both batched and non-batched queries; below is an example of the default GraphqlController rewritten to handle Apollo batches:

def execute
  context = {}

  # Apollo sends the queries in an array when batching is enabled.
  # The data ends up in the _json field of the params variable.
  # See the Apollo documentation about query batching.
  result = if params[:_json]
    queries = params[:_json].map do |param|
      {
        query: param[:query],
        operation_name: param[:operationName],
        variables: ensure_hash(param[:variables]),
        context: context,
      }
    end
    MySchema.multiplex(queries)
  else
    MySchema.execute(
      params[:query],
      operation_name: params[:operationName],
      variables: ensure_hash(params[:variables]),
      context: context
    )
  end

  render json: result, root: false
end

Validation and Error Handling

Each query is validated and analyzed independently. The results array may include a mix of successful results and failed results.
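For example, a failed validation produces an "errors" payload while its sibling queries still succeed. One way to split them, sketched over hand-written result hashes:

```ruby
# Hypothetical multiplex results: one success, one validation failure.
results = [
  { "data" => { "someField" => "hello" } },
  { "errors" => [{ "message" => "Field 'nope' doesn't exist on type 'Query'" }] },
]

# Separate successful results from failed ones:
succeeded, failed = results.partition { |r| r.key?("data") }

succeeded.length # => 1
failed.first["errors"].first["message"] # => "Field 'nope' doesn't exist on type 'Query'"
```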

Multiplex-Level Context

You can add values to Execution::Multiplex#context by providing a context: hash:

MySchema.multiplex(queries, context: { current_user: current_user })

This will be available to instrumentation as multiplex.context[:current_user] (see below).

Multiplex-Level Analysis

You can analyze all queries in a multiplex by adding a multiplex analyzer. For example:

class MySchema < GraphQL::Schema
  # ...
  multiplex_analyzer(MyAnalyzer)
end

The API is the same as query analyzers.

Multiplex analyzers may return AnalysisError to halt execution of the whole multiplex.

Multiplex Tracing

You can add hooks for each multiplex run with trace modules.

The trace module may implement def execute_multiplex(multiplex:), calling super to allow the multiplex to execute. See Execution::Multiplex for available methods.

For example:

# Count how many queries are in the multiplex run:
module MultiplexCounter
  def execute_multiplex(multiplex:)
    Rails.logger.info("Multiplex size: #{multiplex.queries.length}")
    super
  end
end

# ...

class MySchema < GraphQL::Schema
  # ...
  trace_with(MultiplexCounter)
end

Now, MultiplexCounter#execute_multiplex will be called for each execution, logging the size of each multiplex.