Cursors and streaming

By default, prepared query execution fetches all rows. You can use a Cursor to control the number of rows you want to read:

connection.prepare("SELECT * FROM users WHERE first_name LIKE $1", ar0 -> {
  if (ar0.succeeded()) {
    PreparedStatement pq = ar0.result();

    // Cursors must run within a transaction
    connection.begin(ar1 -> {
      if (ar1.succeeded()) {
        Transaction tx = ar1.result();

        // Create a cursor
        Cursor cursor = pq.cursor(Tuple.of("julien"));

        // Read 50 rows
        cursor.read(50, ar2 -> {
          if (ar2.succeeded()) {
            RowSet<Row> rows = ar2.result();

            // Check whether more rows are available
            if (cursor.hasMore()) {
              // Repeat the process...
            } else {
              // No more rows - commit the transaction
              tx.commit();
            }
          }
        });
      }
    });
  }
});
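
The "Repeat the process..." step can be expressed as a small recursive helper that keeps reading batches until the cursor is exhausted. A minimal sketch, assuming a hypothetical readAllRows method and the same 50-row batch size as above:

private void readAllRows(Cursor cursor, Transaction tx) {
  cursor.read(50, ar -> {
    if (ar.succeeded()) {
      RowSet<Row> rows = ar.result();
      for (Row row : rows) {
        // Process each row of the current batch
        System.out.println("User: " + row.getString("last_name"));
      }
      if (cursor.hasMore()) {
        // More rows available - read the next batch
        readAllRows(cursor, tx);
      } else {
        // No more rows - commit the transaction
        tx.commit();
      }
    } else {
      // Reading failed - roll back the transaction
      tx.rollback();
    }
  });
}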

A cursor must be closed when it is released prematurely:

cursor.read(50, ar2 -> {
  if (ar2.succeeded()) {
    // Close the cursor
    cursor.close();
  }
});
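
When the cursor was opened inside a transaction, as in the first example, the transaction still needs to be completed after the early close. One possible pattern, assuming the tx variable from that example and the close overload that takes a completion handler:

cursor.read(50, ar2 -> {
  if (ar2.succeeded()) {
    // Close the cursor, then commit the transaction from the earlier example
    cursor.close(ar3 -> {
      if (ar3.succeeded()) {
        tx.commit();
      }
    });
  }
});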

A stream API is also available for cursors; it can be more convenient, especially with the Rxified version.

connection.prepare("SELECT * FROM users WHERE first_name LIKE $1", ar0 -> {
  if (ar0.succeeded()) {
    PreparedStatement pq = ar0.result();

    // Streams must run within a transaction
    connection.begin(ar1 -> {
      if (ar1.succeeded()) {
        Transaction tx = ar1.result();

        // Fetch 50 rows at a time
        RowStream<Row> stream = pq.createStream(50, Tuple.of("julien"));

        // Use the stream
        stream.exceptionHandler(err -> {
          System.out.println("Error: " + err.getMessage());
        });
        stream.endHandler(v -> {
          tx.commit();
          System.out.println("End of stream");
        });
        stream.handler(row -> {
          System.out.println("User: " + row.getString("last_name"));
        });
      }
    });
  }
});
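
As mentioned above, the Rxified client could presumably consume the same stream as an RxJava Flowable. A minimal sketch, assuming stream is the RowStream obtained through the Rxified API (io.vertx.reactivex.sqlclient.RowStream) and leaving transaction handling out:

stream.toFlowable()
  .subscribe(
    row -> System.out.println("User: " + row.getString("last_name")),
    err -> System.out.println("Error: " + err.getMessage()),
    () -> System.out.println("End of stream"));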

The stream reads the rows in batches of 50 and streams them. Once the rows have been passed to the handler, a new batch of 50 is read, and so on.

The stream can be paused and resumed; while it is paused, the rows already loaded remain in memory until they are delivered and the cursor stops iterating.
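
For example, a handler that hands each row to an asynchronous operation can pause the stream and resume it once that operation completes. A minimal sketch, assuming a hypothetical asynchronous saveUser method:

stream.handler(row -> {
  // Stop fetching further batches while this row is processed
  stream.pause();
  // saveUser is a hypothetical asynchronous operation taking a completion callback
  saveUser(row, done -> {
    // Resume fetching once the row has been handled
    stream.resume();
  });
});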